
again a bash script question

Sun Nov 13, 2016 8:45 am

What i do is this:
Code:
if lynx -dump -nolist some-http-webpage | grep "$TERM"
then
    echo "found $TERM at some-http-webpage"
fi

What i would like is to
store the result of lynx -dump in a file
and
if the file is older than a week, run lynx again
else
grep through the file
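In outline, that could look like this (a sketch only; the URL, the cache filename, and the seven-day cutoff are placeholders):

```shell
#!/bin/sh
# Sketch: cache the lynx dump and refresh it when it is older than a week.
url="some-http-webpage"     # placeholder
cache="repo_dump.txt"       # placeholder
term="$1"

# find prints the file only when it was modified more than 7 days ago,
# so we test find's *output*, not its exit status.
if [ ! -f "$cache" ] || [ -n "$(find "$cache" -mtime +7)" ]
then
    lynx -dump -nolist "$url" > "$cache"
fi

if grep "$term" "$cache"
then
    echo "found $term at $url"
fi
```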

thanks.
Last edited by nadir on Tue Nov 15, 2016 3:57 pm, edited 1 time in total.

Re: again a bash script question

Sun Nov 13, 2016 9:03 pm

Redirect the lynx output to a file, then grep the file for $TERM.

Before you do that, use 'find -mtime <whatever a week ago looks like> -name same-file-as-above' and, based on the result, either grep the existing file or fetch and grep a fresh copy. You can use mtime to find files modified before or after some point in the past. Been a while since we've done this, huh?
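For reference, the `-mtime` boundaries are easy to check with `touch -d` (a throwaway demo; GNU find and touch assumed):

```shell
#!/bin/sh
# -mtime +7: modified more than 7 days ago; -mtime -7: within the last 7 days.
dir=$(mktemp -d)
touch -d "10 days ago" "$dir/old.txt"
touch "$dir/new.txt"

find "$dir" -type f -mtime +7    # prints only old.txt
find "$dir" -type f -mtime -7    # prints only new.txt
rm -r "$dir"
```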

Re: again a bash script question

Mon Nov 14, 2016 7:14 am

Lol. Yes, i do remember something with times. But anything scripting-related has been a while for me.

In the meantime i ran into a different problem.
This does work:
Code:
if lynx -dump -nolist http://main.mepis-deb.org/MX14packages.html | grep "$search_term"
then
FOUND="yes"
printf "%s found in community repo http://main.mepis-deb.org/MX14packages.html\n" "$search_term"
fi

But as it's several repos, i put them in a list, and now it doesn't work anymore:
Code:
repo_list=(
        http://teharris.net/NewPackages.html
        http://main.mepis-deb.org/MX15packages.html
        http://main.mepis-deb.org/MX14packages.html
        )


#----------------------------------------------------------

for i in ${repo_list[*]}
do
#       if echo "$i"
        if lynx -dump -nolist "$i" | grep "$searchterm"
        then
                FOUND="yes"
                printf "\n$searchterm found in $i\n"
        fi
done

"Doesn't work" means:
a) it puts all the output of lynx on the screen, not only a successful grep, and
b) it always says "found at ", no matter what
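(For what it's worth, both symptoms are exactly what an empty grep pattern produces; the variable name below is just for illustration:)

```shell
#!/bin/sh
# An unset variable expands to the empty string, and grep with an
# empty pattern matches every line and always exits 0.
unset searchterm
printf 'line1\nline2\n' | grep "$searchterm"
# prints both lines; exit status is 0 ("found")
```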

Here are the two complete scripts:
working way:
http://sprunge.us/COIH
not working with array and loop
http://sprunge.us/HYKT

Usage is: sh repo_crawler <searchterm>

Re: again a bash script question

Mon Nov 14, 2016 8:23 pm

I found the problem:
1) variable declared as search_term
search_term="$1"
and used as searchterm
2)
if lynx -dump -nolist "$i" | grep "$searchterm"

... now back to the file-date voodoo ...

Re: again a bash script question

Mon Nov 14, 2016 10:35 pm

Your github page has been taken over by someone else. I was going to find an example that we worked on together. Anyway...
Code:
# find files in current directory modified more than 30 days ago:
find . -type f -mtime +30

# find files in current directory modified in the last 10 days:
find . -type f -mtime -10


Code:
if find . -name myfile -mtime +7 ; then
    if lynx -blah > myfile ; then
        grep whatever
        print whatever
    fi
else
    grep whatever
    print whatever
fi

Re: again a bash script question

Tue Nov 15, 2016 10:22 am

I got problems with it. Might be me.
But i assume that
a) find . -name dragora_repos.txt -mtime +7
will indeed print nothing
in case dragora_repos.txt is newer than a week
but
b) using if find ... ; then will return a true value anyway.

Code:
user$ find . -name dragora_repos.txt  -mtime +7
user$ echo $?
0
user$ rm dragora_repos.txt
user$ touch dragora_repos.txt
user$ if find . -name dragora_repos.txt -mtime +7; then echo "elder than a week"; fi
elder than a week
user$

Re: again a bash script question

Tue Nov 15, 2016 11:08 am

i did it like this:
if [ $(find . -name "$log_file" -mtime +7) ]
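That works because the command substitution expands to a non-empty word when find prints the filename; quoting it and testing with `-n` makes the same check a bit sturdier (a sketch, same filename as below):

```shell
#!/bin/sh
log_file="mepis_repos.txt"
if [ -n "$(find . -name "$log_file" -mtime +7)" ]
then
    echo "older than a week"
fi
```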

The resulting codeblock is:
Code:
#!/usr/bin/bash
### Trying to implement a text file cache for crawl_mepis.sh
### so we don't have to download all repos each time we run it
### shit knows if this works reliably
### hence its actual tmp-name is: hack_crawl_mepis :-)

# 1) make sure search_term is given as argument
if [ "$#" -ne 1 ]
then
   printf "\nEnter program you search for as argument\n" 1>&2
   exit 1
fi
#----------------------------------------------------------
# 2) set variables
search_term="$1"
found=""
log_file="mepis_repos.txt"
# shorter URL list for testing purposes
repo_list=(     http://teharris.net/NewPackages.html
      http://main.mepis-deb.org/MX15packages.html
      http://main.mepis-deb.org/MX14packages.html )


# 3) if this is the first run, mepis_repos.txt doesn't exist:
# run lynx to download
if [ ! -f "$log_file" ]
then

   for i in "${repo_list[@]}"
   do
      if lynx -dump -nolist "$i" | tee -a "$log_file" | grep "$search_term"
      then
         found="yes"
         printf "\n%s found in %s\n" "$search_term" "$i"
      fi
   done
# if mepis_repos.txt exists, check when it was last updated:
# if older than x days, lynx-dump the repos again,
# else just grep through the file
else
   if [ $(find . -name "$log_file" -mtime +7) ]
   then
      rm "$log_file"
      for i in "${repo_list[@]}"
      do
         if lynx -dump -nolist "$i" | tee -a "$log_file" | grep "$search_term"
         then
            found="yes"
            printf "\n%s found in %s\n" "$search_term" "$i"
         fi
      done
   else
      grep "$search_term" "$log_file"
   fi
fi



http://sprunge.us/TZhg
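Since the download loop appears twice, it could also be factored into a function; a sketch along the same lines (the function name is invented, and it reuses the script's repo_list, log_file and search_term variables):

```shell
#!/usr/bin/bash
# Sketch: the repeated lynx/tee/grep loop as one function.
fetch_and_search () {
    for i in "${repo_list[@]}"
    do
        if lynx -dump -nolist "$i" | tee -a "$log_file" | grep "$search_term"
        then
            found="yes"
            printf "\n%s found in %s\n" "$search_term" "$i"
        fi
    done
}
```

Both branches of the mtime check could then just call fetch_and_search after removing the stale file.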

Will need more testing,
but thanks a lot, fsmithred.

Re: again a bash script question

Tue Nov 15, 2016 3:37 pm

Yeah, I got the same problem and came up with the same solution.

But I don't understand why you need the script. Doesn't 'aptitude search <package>' work in mepis?

Re: again a bash script question

Tue Nov 15, 2016 3:53 pm

glad to hear you came up with the same solution.

As far as the aptitude question is concerned:
I guess so. I don't use it.
The script is for dragora; there are community repos, and right now you can't search them with the package manager.
There are ca 6 or 7 of them. If i need a package, i check via web whether it exists: go to 1 URL, use Firefox's find, go to the next URL, use Firefox's find, etc. That gets very time-consuming if i do it, say, 5 times a day (in the same time i could already have packaged it myself ...).

Why did i use the Mepis community repos instead? I am not sure how much load the dragora server can handle. And, another reason, i thought more people here might be interested in Debian-based URLs (i assumed aptitude would not be able to handle the community Mepis repos).