Discussion: Cron and Curl
Mr. G
2015-04-13 18:41:02 UTC
Cron and Curl don't play well together, on my machine at least.
I've a rexx script which uses curl to download a number of files and cron
to launch the script. In the beginning, all was well. Over time, it
started occasionally giving a 'can't resolve host' rc. Now, 'can't
resolve host' happens 100% of the time.
Manually starting the script from a cmd line or program object works every
time and the script does its thing.
What I've tried:
Moving the time slot in cron = no good
Using a different cron :) = no good
Wget instead of curl = no good....even from a cmd line, a simple 'wget url'
connects, but the server never responds to the request. Options in .wgetrc
are tries and waitretry.

curl version 7.20.0 (v7.20.1 reports a missing 'kcrypt04.' which I presume
is an openssl dll, and there is no such animal either in the 1.0.0 version
or in the latest drop of a week or so ago).

url in question =
http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN

Any ideas as to why curl can't resolve host when script is started from
cron, but when started manually, curl has no problems? Oh, and it is NOT
started detached in cron.
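For reference, the download step in the script boils down to something like
this (a minimal sketch; file names are illustrative, the real script loops
over a list of files):

/* fetch the index page and bail out on any curl error */
url = 'http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN'
'curl -s -o index.html "'url'"'
if rc \= 0 then do
   call lineout 'err.txt', 'curl failed with rc' rc 'at' date() time()
   call lineout 'err.txt'          /* close err.txt */
   exit rc
end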

Dave Yeo
2015-04-13 21:01:33 UTC
Post by Mr. G
Any ideas as to why curl can't resolve host when script is started from
cron, but when started manually, curl has no problems? Oh, and it is NOT
started detached in cron.
The usual guess is that it's a shared memory problem, or possibly that
cron tries to use syslog to log itself and that screws up the tcpip stack.
Have you tried a different cron? Another option: dragtext includes a free
WPS extension that adds a schedule option to the WPS, so you could use
that to launch your script.
Dave
Mr. G
2015-04-14 07:05:04 UTC
Post by Dave Yeo
Post by Mr. G
Any ideas as to why curl can't resolve host when script is started from
cron, but when started manually, curl has no problems? Oh, and it is NOT
started detached in cron.
The usual guess is that it's a shared memory problem, or possibly that
cron tries to use syslog to log itself and that screws up the tcpip stack.
Have you tried a different cron? Another option: dragtext includes a free
WPS extension that adds a schedule option to the WPS, so you could use
that to launch your script.
Dave
Yes, I did try a different cron. No difference.
Tried dragtext (thanks for the suggestion); the program is over my head.
Un- & reinstalled 3 times. Never could find a schedule option anywhere.
Not where the help file pointed to, nor anywhere else in the whole system.
Weird, huh?

Paul Ratcliffe
2015-04-14 09:43:17 UTC
Post by Mr. G
Tried dragtext (thanks for the suggestion); the program is over my head.
Un- & reinstalled 3 times. Never could find a schedule option anywhere.
Not where the help file pointed to, nor anywhere else in the whole system.
Weird, huh?
It's object oriented. Try opening the Settings for an Executable
or Program Object and you should find a tab labelled Schedule.
(You did restart the WPS after installation didn't you?)
Mr. G
2015-04-14 17:48:43 UTC
On Tue, 14 Apr 2015 09:43:17 UTC, Paul Ratcliffe
Post by Paul Ratcliffe
Post by Mr. G
Tried dragtext (thanks for the suggestion); the program is over my head.
Un- & reinstalled 3 times. Never could find a schedule option anywhere.
Not where the help file pointed to, nor anywhere else in the whole system.
Weird, huh?
It's object oriented. Try opening the Settings for an Executable
or Program Object and you should find a tab labelled Schedule.
(You did restart the WPS after installation didn't you?)
That's the kick-in-the-pants. I did look in settings for several exes and
program objects, no schedule tab.
Yes, after each install and uninstall, I did a full reboot not just a WPS
restart.
Aside from that, I could never use even just the schedule option of
dragtext for what it does to the mouse pointer.

Paul Ratcliffe
2015-04-14 23:31:08 UTC
Post by Mr. G
Post by Paul Ratcliffe
It's object oriented. Try opening the Settings for an Executable
or Program Object and you should find a tab labelled Schedule.
(You did restart the WPS after installation didn't you?)
That's the kick-in-the-pants. I did look in settings for several exes and
program objects, no schedule tab.
Did you check that the DTProgram class is installed/registered?
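You can verify it from Rexx with something like this (a sketch; that the
class registers under the name DTProgram is my assumption):

/* list the registered WPS classes and look for DTProgram */
call RxFuncAdd 'SysLoadFuncs', 'RexxUtil', 'SysLoadFuncs'
call SysLoadFuncs
call SysQueryClassList 'cls.'
do i = 1 to cls.0
   if pos('DTPROGRAM', translate(cls.i)) > 0 then say 'Found:' cls.i
end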
Post by Mr. G
Aside from that, I could never use even just the schedule option of
dragtext for what it does to the mouse pointer.
What? First you say the schedule option is not available, then you claim
you can't use it because it causes some other (unspecified) problem.

Sounds like you need to reinstall from scratch.
Mr. G
2015-04-15 05:39:20 UTC
On Tue, 14 Apr 2015 23:31:08 UTC, Paul Ratcliffe
Post by Paul Ratcliffe
Post by Mr. G
Post by Paul Ratcliffe
It's object oriented. Try opening the Settings for an Executable
or Program Object and you should find a tab labelled Schedule.
(You did restart the WPS after installation didn't you?)
That's the kick-in-the-pants. I did look in settings for several exes and
program objects, no schedule tab.
Did you check that the DTProgram class is installed/registered?
Only half checked...dragtext says it's installed.
Post by Paul Ratcliffe
Post by Mr. G
Aside from that, I could never use even just the schedule option of
dragtext for what it does to the mouse pointer.
What? First you say the schedule option is not available, then you claim
you can't use it because it causes some other (unspecified) problem.
No, I said the option is not available and that I wouldn't use it "IF" it
was. AND I never said problem. It is a change in behavior. Prior to
dragtext, the action was menu follows pointer. After dragtext, the action
reversed to pointer follows menu (automatically). That action is
unacceptable to me and annoying as hell. This may or may not be the case
on your or anyone else's machine. I can only report what happens on mine.
Post by Paul Ratcliffe
Sounds like you need to reinstall from scratch.
Dave Yeo
2015-04-15 02:51:26 UTC
Post by Mr. G
Aside from that, I could never use even just the schedule option of
dragtext for what it does to the mouse pointer.
You can remove the shareware dragtext part and keep the scheduler (and
environment settings) part
Dave
Mr. G
2015-04-15 06:32:52 UTC
Post by Dave Yeo
Post by Mr. G
Aside from that, I could never use even just the schedule option of
dragtext for what it does to the mouse pointer.
You can remove the shareware dragtext part and keep the scheduler (and
environment settings) part
Dave
With the reversed mouse pointer action, dragtext, or any part of it, is off
the table :( unless there is a way I can keep the pointer action as is, and
that's a can of worms I'd rather not open.



Dave Yeo
2015-04-15 02:54:07 UTC
Post by Mr. G
Wget instead of curl = no good....even from a cmd line, a simple 'wget url'
connects, but the server never responds to the request. Options in .wgetrc
are tries and waitretry.
Which port of wget did you test? There's a VACPP-compiled one (Hobbes)
that might be worth testing. Same with Curl, there are a couple of builds
floating around.
Dave
Mr. G
2015-04-15 18:10:48 UTC
Post by Dave Yeo
Post by Mr. G
Wget instead of curl = no good....even from a cmd line, a simple 'wget url'
connects, but the server never responds to the request. Options in .wgetrc
are tries and waitretry.
Which port of wget did you test? There's a VACPP-compiled one (Hobbes)
that might be worth testing. Same with Curl, there are a couple of builds
floating around.
Dave
wget191-os2-bin-vac.zip is the one I used. Maybe I should try an earlier
version?
As for curl, the only two I know about are v7.20.0 (which I'm using) and
v7.20.1, which, I just realised, needs the specific version of openssl it
calls for, as later versions of openssl do not have a required dll. I
think the dll names changed in v1.0.0.

The perplexing thing is that curl works flawlessly when the script is
started from a cmd line or program object. Only when the script is started
from cron does curl barf.




Dave Yeo
2015-04-16 00:49:11 UTC
Post by Mr. G
wget191-os2-bin-vac.zip is the one I used. Maybe I should try an earlier
version?
As for curl, the only two I know about are v7.20.0 (which I'm using) and
v7.20.1, which, I just realised, needs the specific version of openssl it
calls for, as later versions of openssl do not have a required dll. I
think the dll names changed in v1.0.0.
I have version 7.28 of curl here, compiled by Paul (smedley.info) and
various copies of wget. The openssl DLL name depends on who compiled it
(different names to avoid sys2070's). IIRC kcrypt was compiled by KOMH
and should be on Hobbes.
Dave
Mr. G
2015-04-20 17:19:01 UTC
Post by Dave Yeo
Post by Mr. G
wget191-os2-bin-vac.zip is the one I used. Maybe I should try an earlier
version?
As for curl, the only two I know about are v7.20.0 (which I'm using) and
v7.20.1, which, I just realised, needs the specific version of openssl it
calls for, as later versions of openssl do not have a required dll. I
think the dll names changed in v1.0.0.
I have version 7.28 of curl here, compiled by Paul (smedley.info) and
various copies of wget. The openssl DLL name depends on who compiled it
(different names to avoid sys2070's). IIRC kcrypt was compiled by KOMH
and should be on Hobbes.
Dave
Found Paul's curl version 7.36...still fails to resolve host when started
from cron.

Seems the server in question only responds to requests from a 'browser'
user agent, so I got wget working. Recoded the script for wget and it worked
from cron on Sunday, but this morning it was back to 'can't find host'.


A.D. Fundum
2015-04-26 09:03:41 UTC
Post by Mr. G
this morning it was back to 'can't find host'.
So you were using such a string. It works here, at least a few times
in a row. If it sometimes doesn't work, then you could also use
explicit waitretry-related parameters and, more importantly, some
Rexx script to verify e.g. the size of a received file, if any.

You'll perhaps have to adjust the strategy. Waiting longer often has
no use. A Rexx-based, conditional retry may be a better option than
assuming that it will always work.
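A sketch of what I mean, using your URL (the retry count, sleep and size
threshold are just examples):

/* retry wget a few times, judging success by the size of the received file */
call RxFuncAdd 'SysLoadFuncs', 'RexxUtil', 'SysLoadFuncs'
call SysLoadFuncs
do try = 1 to 5
   'wget -q -O index.html "http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN"'
   size = stream('index.html', 'C', 'QUERY SIZE')
   if rc = 0 & datatype(size, 'W') & size > 1000 then leave  /* plausible file */
   call SysSleep 30   /* pause before the next attempt */
end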


--
Mr. G
2015-04-26 21:23:14 UTC
Post by A.D. Fundum
Post by Mr. G
this morning it was back to 'can't find host'.
So you were using such a string. It works here, at least a few times
in a row. If it sometimes doesn't work, then you could also use
explicit waitretry-related parameters and, more importantly, some
Rexx script to verify e.g. the size of a received file, if any.
You'll perhaps have to adjust the strategy. Waiting longer often has
no use. A Rexx-based, conditional retry may be a better option than
assuming that it will always work.
I assume nothing! Every phase of the script has error routines
associated. If something goes wrong, an err.txt file appears on my
desktop, and it says what the problem is. A quick glance in the
morning tells me if the downloads were successful or if something
needs my attention.



A.D. Fundum
2015-04-27 06:28:28 UTC
Post by Mr. G
Post by A.D. Fundum
A Rexx-based, conditional retry may be a better option
than assuming that it will always work.
I assume nothing!
Nor do I; it's a general remark. You've reported at least two error
messages, which already was a clear indication that you don't assume
that it'll always work.
Post by Mr. G
tells me if the downloads were successful or if something
needs my attention.
Based on the possibly inaccurate error messages (server down,
maintenance, DNS issues, you're blacklisted because of the number of
downloads, one of the services they are using is too slow, bad
connection, ...) you can try it again after a while before you look at
it, or you can try it again in the morning. That'll depend on the type
of data.

In general, failures aren't that rare, and a disadvantage of a strategy
is that any change (e.g. a new layout) may influence the processes
severely.

In a specific case it depends on the type (and quality) of the data.
If you can download the data in the morning too, then you don't have
to work out all the possible strategies, and you can accept an
occasional failure. In my specific case I cannot download the same data
the next morning, so the (results of the) downloads are attended and
verified.


--
A.D. Fundum
2015-04-20 15:23:47 UTC
Post by Mr. G
Wget instead of curl = no good....even from a cmd line, a
simple 'wget url' connects, but the server never responds
to the request. Options in .wgetrc are tries and waitretry.
If it always fails, then a waitretry-setting isn't that relevant. If
you can and want to share it: what's the URL? There are more options
(/ parameters) to avoid possible problems, like:

--user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0)
Gecko/20100101 Firefox/24.0"

In scripts I'd use required parameters instead of .wgetrc. You can use
different or older strings. But I wouldn't use newer strings, to
promote support for both older OSes and FF24.


--
Mr. G
2015-04-20 23:31:48 UTC
Post by A.D. Fundum
Post by Mr. G
Wget instead of curl = no good....even from a cmd line, a
simple 'wget url' connects, but the server never responds
to the request. Options in .wgetrc are tries and waitretry.
If it always fails, then a waitretry-setting isn't that relevant. If
you can and want to share it: what's the URL? There are more options
--user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0)
Gecko/20100101 Firefox/24.0"
In scripts I'd use required parameters instead of .wgetrc. You can use
different or older strings. But I wouldn't use newer strings, to
promote support for both older OSes and FF24.
Actually, the whole .wgetrc is not relevant: the 2 options in it don't
need to be in a script or on a cmd line for every use I may find for wget.
I use it to permanently override the default tries and waitretry, just in
case. All other parameters are in the script.
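For reference, the whole .wgetrc amounts to two lines like these (the
values here are illustrative):

tries = 5
waitretry = 10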

The 'simple cmd' I started with was to figure out what I needed to make
wget work how I need it to, before changing the script. That has been
solved with the use of a user agent, and wget is now working.

I changed the script to use wget in place of curl and hoped for the best.
It worked one time, on Sunday. This morning, wget failed to find the host,
which is the problem I'm trying to get figured out.
When the script is run via cron, curl and now wget fail to resolve/find the
host. If I run the script directly from a cmd line or a program object,
both curl and wget connect right away and work flawlessly.

The url that fails to resolve host is
www.equibase.com/static/chart/pdf/index.html?SAP=HLN

Note: This is not a new setup. I've been using cron to run the same script
daily for the past 4 years without fail.
Just recently..... no go :-(



Heikki Kekki
2015-04-21 17:10:25 UTC
Post by Mr. G
I changed the script to use wget in place of curl and hoped for the best.
It worked one time, on Sunday. This morning, wget failed to find the host,
which is the problem I'm trying to get figured out.
When the script is run via cron, curl and now wget fail to resolve/find the
host. If I run the script directly from a cmd line or a program object,
both curl and wget connect right away and work flawlessly.
The url that fails to resolve host is
www.equibase.com/static/chart/pdf/index.html?SAP=HLN
Mark Eckstein's Task Planner (eClock shipped with eCS 1.1) works here
with above url and curl.
Now using eCS 2.1.
--
Hessu
Mr. G
2015-04-21 23:54:52 UTC
Post by Heikki Kekki
Post by Mr. G
I changed the script to use wget in place of curl and hoped for the best.
It worked one time, on Sunday. This morning, wget failed to find the host,
which is the problem I'm trying to get figured out.
When the script is run via cron, curl and now wget fail to resolve/find the
host. If I run the script directly from a cmd line or a program object,
both curl and wget connect right away and work flawlessly.
The url that fails to resolve host is
www.equibase.com/static/chart/pdf/index.html?SAP=HLN
Mark Eckstein's Task Planner (eClock shipped with eCS 1.1) works here
with above url and curl.
Now using eCS 2.1.
Thanks for the tip. I no longer run eCS, and it can't be used with plain
ol' OS/2.

A.D. Fundum
2015-04-26 08:40:27 UTC
Post by Mr. G
Post by A.D. Fundum
Post by Mr. G
Wget instead of curl = no good....even from a cmd line, a
simple 'wget url' connects, but the server never responds
to the request. Options in .wgetrc are tries and waitretry.
If it always fails, then a waitretry-setting isn't that relevant. If
you can and want to share it: what's the URL? There are more options
--user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0)
Gecko/20100101 Firefox/24.0"
In scripts I'd use required parameters instead of .wgetrc. You can use
different or older strings. But I wouldn't use newer strings, to
promote support for both older OSes and FF24.
Just recently..... no go :-(
Actually USE the setting I've recommended. Without it: failure. With
that --user-agent string: works. One line:

wget --user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0)" "http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN"

Their software and/or settings, probably assuming the use of
Microsoft products or rejecting download bots, is your problem. So
Wget has to pretend to be a Microsoft-based product, but another
string may work too.


--
Mr. G
2015-04-26 22:23:58 UTC
------stuff snipped------
Post by A.D. Fundum
Post by Mr. G
Just recently..... no go :-(
Actually USE the setting I've recommended. Without it: failure. With
It doesn't matter what string is used for the site in question, only
that the string is identified as a 'browser'. Try it with a Mozilla OS/2
ver 17 string to see what I mean.
Post by A.D. Fundum
wget --user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0)" "http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN"
Their software and/or settings, probably assuming the use of
Microsoft products or rejecting download bots, is your problem. So
Wget has to pretend to be a Microsoft-based product, but another
string may work too.
That is not the problem. Not resolving host means the DNS server
cannot convert the words www.etc into the dotted-decimal notation
used to make the actual connection. Site settings and/or rejections don't
mean anything if one can't connect in the first place.

Yes, they loosely assume m$ products, but they don't deny other OSes'
browsers.
When it comes to networking (internal or external), I'm probably one
of the dullest knives in the drawer. I never thought of searching
(DuckDuckGo) about the issue until this morning. I can be soooo
stupid sometimes. What I surmised from the search is that it is not
a curl/wget + cron problem, but a DNS problem. Two possibilities:
the DNS server is non-operational at the time, or, more probably,
the DNS cache is wrong. Why it works from a cmd line and doesn't
work when started from cron is still beyond me.

Hopeful solution...bypass DNS and use the numerical notation directly
in the script. In my case,
url=199.115.24.29/static/chart/pdf/index.html?SAP=HLN
Tried it this morning, and all went well starting from cron.
I'll wait a week or so before thinking it's solved.



Dave Yeo
2015-04-26 23:37:46 UTC
Post by Mr. G
Hopeful solution...bypass DNS and use the numerical notation directly
in the script. In my case,
url=199.115.24.29/static/chart/pdf/index.html?SAP=HLN
Tried it this morning, and all went well starting from cron.
I'll wait a week or so before thinking it's solved.
You could also add the host:ip address to your hosts file. The danger is
if the site changes address, quite possible for a large site. Another
possibility is to use a different DNS server.
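Something like this in the hosts file (using the address from your post):

199.115.24.29   www.equibase.com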
Dave
A.D. Fundum
2015-04-27 05:50:02 UTC
Post by Dave Yeo
Post by Mr. G
Hopeful solution...
You could also add the host:ip address to your hosts file.
Using the IP address, as part of a strategy, may reduce the number of
problems, and may introduce (rare) new problems. But downloads may
still fail. In the past I've downloaded data from "Wall Street"
(nyse.com). Attended. Regardless of Wget's settings there were up to 16
retries by Rexx, based on missing files. One of the problems during a
bad day was that about 75% of all downloads could fail (regardless of
any right or wrong error message; I'm using -q). So I received 300 of
400 files, 75 of the remaining 100 files after the first retry, and so
on.

Please note, IIRC, that the first error message wasn't that clear. It
wasn't "Unsupported client or operating system version, so the server
of our host won't serve you.". Using an IP address wouldn't have
solved the first reported issue. BTW, I've tried using Rexx'
FtpPing(), but that didn't really help in my case.

If you're indeed using a script, then you can e.g. also check for a
reasonable size and whether the file contains the expected data. Wget's
options (any reasonable version) are pretty good, but if their server
is down then quickly trying it 100 times perhaps has no use. A
customized, optimized data quality strategy isn't required in case of
nice-to-have data, but perhaps such a strategy is required in case of
must-have data.

General Wget options I'm using are -q (quiet), -O (name of output
file) and often a user-agent string (using an OS/2 or Windows string,
and hopefully a browser version available for OS/2).
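Put together, that gives something like this (the output file name is just
an example):

wget -q -O charts.html --user-agent="Mozilla/5.0 (Windows NT 6.0; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0" "http://www.equibase.com/static/chart/pdf/index.html?SAP=HLN"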


--
Mr. G
2015-05-01 20:27:48 UTC
Post by A.D. Fundum
Post by Dave Yeo
Post by Mr. G
Hopeful solution...
You could also add the host:ip address to your hosts file.
Using the IP address, part of a strategy, may reduce the number of
problems, and may introduce (rare) new problems. But downloads may
still fail.
The only potential problem I can see, is if a site changes their IP adress,
then the hosts file or script will not get updated automatically. No big
deal.
If that were to happen, my script would not find any files to download from
the index.html, would raise an error stating such, and to check the site
for a problem on their end.
If you know of other possible problems, please elaborate.
Post by A.D. Fundum
If you're indeed using a script, then you can e.g. also check for a
reasonable size and whether the file contains the expected data. Wget's
options (any reasonable version) are pretty good, but if their server
is down then quickly trying it 100 times perhaps has no use. A
customized, optimized data quality strategy isn't required in case of
nice-to-have data, but perhaps such a strategy is required in case of
must-have data.
Does size really matter? <g> Jokes aside, the files I get are under
300KB, with most under 200KB, and they take no more than .02 of a second
to download. After the download cycle completes, the script first checks
for the existence of the files in the list, then opens each file and checks
it for validity. There is one day out of the year when there will be no
files. That contingency is also written into the script.
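The validity check is roughly this (a sketch; LogError stands in for my
actual error routine, and '%PDF' assumes the charts are PDFs):

/* verify each expected file exists and starts with the PDF signature */
do i = 1 to file.0
   if stream(file.i, 'C', 'QUERY EXISTS') = '' then
      call LogError file.i 'is missing'
   else do
      magic = charin(file.i, 1, 4)
      call stream file.i, 'C', 'CLOSE'
      if magic \== '%PDF' then call LogError file.i 'failed the validity check'
   end
end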

As you can see, this is must-have data, and I've got the bases covered.
I think the files being so small is the reason I've never had a failed
file download to date. (Now that I've said it, I'll have one tomorrow.)

Note: I said file download, not failure due to connection or downed server,
etc.

"quickly trying it 100 times".....that's what the wait-retry option is for.


Mr. G
2015-04-28 19:15:39 UTC
Post by Dave Yeo
Post by Mr. G
Hopeful solution...bypass DNS and use the numerical notation directly
in the script. In my case,
url=199.115.24.29/static/chart/pdf/index.html?SAP=HLN
Tried it this morning, and all went well starting from cron.
I'll wait a week or so before thinking it's solved.
You could also add the host:ip address to your hosts file. The danger is
if the site changes address, quite possible for a large site. Another
possibility is to use a different DNS server.
Dave
That's an idea. Not too likely the site will change their address. It's
been the same for at least the past 15 years. I don't know if it could be
considered a large site (many retail sites are much larger), but it is the
official data provider for the industry, with many entities paying a
subscription for direct access to their database in addition to what they
offer on their web pages for free or for a fee.

Sans router, would the address of the DNS server be put in the RESOLV
file? What's the resolv2 file for?

Dave Yeo
2015-04-28 23:59:45 UTC
Post by Mr. G
Sans router, would the address of the DNS server be put in the RESOLV
file? What's the resolv2 file for?
I forget which is which but IIRC one is for connecting through LAN
(ethernet) and one is for connecting with dial-up. Safest is to update both.
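Either way, the entry is just one line per DNS server, e.g. (Google's
public resolver as an example address):

nameserver 8.8.8.8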
Dave
Mr. G
2015-05-01 20:28:54 UTC
Post by Dave Yeo
Post by Mr. G
Sans router, would the address of the DNS server be put in the RESOLV
file? What's the resolv2 file for?
I forget which is which but IIRC one is for connecting through LAN
(ethernet) and one is for connecting with dial-up. Safest is to update both.
Dave
Makes sense, thanks.

A.D. Fundum
2015-05-02 13:47:19 UTC
If someone can be bothered, then these efforts may be useless. If I use
mtr-0-51-bin.zip's MTR.CMD and the IP address in question, then you
may experience this pretty soon:

UNKNOWNHOST                                            Sat May 2 15:43:52 2015
Keys: D - Display mode  R - Restart statistics  Q - Quit
                                              Packets              Pings
 Hostname                               %Loss  Rcv  Snt   Last  Best   Avg  Worst
  1. mygateway1.ar7                        0%    7    7     30    30    30     30
  2. 90                                    0%    7    7     40    30    32     40
  3. 90-145-146-141-ams-unet-nik-cr02.ne   0%    7    7     30    30    30     30
  4. ams                                   0%    7    7     30    30    32     40
  5. ams-unet-nik-br12-te-0-0-0-0.unet.n   0%    7    7     30    30    30     30
  6. ???
  7. ae-3-3503.edge4.Chicago3.Level3.net  15%    6    7    130   120   130    150
  8. 4.68.71.102                           0%    7    7    120   120   130    150
  9. lex1-ar2-ae20-0.us.twtelecom.net      0%    7    7    130   120   138    190
 10. 216.84.105.34                         0%    7    7    130   120   127    130
 11. ???
 12. equibase.com                          0%    6    6    160   130   153    160
Resolver: Received error response 2. (server failure)


--
Mr. G
2015-05-02 20:24:18 UTC
Post by A.D. Fundum
If someone can be bothered, then these efforts may be useless. If I use
mtr-0-51-bin.zip's MTR.CMD and the IP address in question, then you
may experience this pretty soon:
UNKNOWNHOST                                            Sat May 2 15:43:52 2015
 12. equibase.com                          0%    6    6    160   130   153    160
Resolver: Received error response 2. (server failure)
It appears to me from the term "Resolver" that MTR is still going through
a DNS server. I decided to use the numerical notation in my script and
bypass DNS altogether; DNS's sole purpose is to translate alphanumeric
names into the numeric notation used to make the actual connection, which
is how the internet connects. Putting it in a hosts file should amount to
the same thing.

As stated previously, I already have a routine in place for "server
failure", whatever the reason. On top of that, you ran your test on one
of the two busiest days of the year for the site, so a server overload
is to be expected at times.

If a server is down, my script will catch it and react according to the way
I want it to, not necessarily the best way, or how someone else would
do it, but "my" way. Frank Sinatra anyone?

All suggestions are evaluated, most are tried, and the one that fits me
best is decided on, especially when there can be multiple solutions. When
there is only one solution, I don't have a choice, and have to go with that.

I do appreciate and thank you for your input.


