The FuzzyBlog! Marketing 101. Consulting 101. PHP Consulting. Random geeky stuff. I Blog Therefore I Am.
Tuesday, May 14, 2002
Damn It! My Time is Valuable Too
Traditionally, a programmer's time has been considered very valuable, and rightfully so. Programmers are highly trained, often brilliant individuals. But there are now millions of programmers and, while you are still hugely valuable, you can no longer be the prima donnas of yore. It's now 2002 and my time is also valuable. I hope that every programmer reads this. That's totally unrealistic, so I hope that the programmers I work with read this.
==>"I Can't Test My Own Code" <==
1:38:32 PM
And the Winner (So Far) in the FTP Race is ...
Well, the FTP race continues. The best thing so far is FlashFXP. However, I have major reservations. The product claims to do what I need, but it doesn't seem to actually do it. When I follow the tutorial, get two FTP sites displayed concurrently, and try to transfer files between them, no files move. I was able to upload from my local machine to my destination server, so I do know that bytes will move.
It's also what I call a "well dressed but deceiving" product. The GUI is really nice but it leads to lots of usability errors. I understand that the people who make this are, whether or not it works for me, definitely lesser demigods of the FTP world. It has, I think, every possible option. But then you try to actually use it and it naturally leads you the wrong way. For example, it lets you drag and drop between locations but then doesn't do anything (guys, there are options to turn this stuff off right in the VC property sheets on your widgets). It will let me connect to Site B while still connected to Site A, but then when I do a transfer, files go to Site A (when moving from local disk). The command lines for setting the path I move from can be edited when the source or destination is Unix but not on Windows (so if I want to move from C:, it's drop down the list, move up multiple times, and press Enter). My general feeling is that they put its best dress on but never bothered to give it a shower before sending it to the dance.
I was going to link to the product's tutorial showing how the site-to-site transfers work, but they stupidly do some kind of wacky DHTML or frame stuff so I don't have a clue what the URL is. Bizarre and utterly, completely stupid (IMHO) for this type of content. If I can't find it then neither can Google. Update: Here it is.
I'm going to email a friend who writes the best FTP crawler I've seen yet and ask him. I checked out his code Sunday night when he was debugging some interesting issues with MS FTP servers and it, well, rocks.
10:19:32 AM
Not to Begin the Day with a Rant... (Yeah Right!)
Ok. A cool guy just IM'd me with FTP suggestions: "tried CuteFTP or aceftp?" Cool. This is just awesome. I ask a question and two hours later someone smarter than me in Norway pops up with good ideas. Here's the problem. CuteFTP, which seems to do what I want, is, ahem, offline. I wouldn't mind this if they didn't also SELL an FTP server. Here's the screen cap:
I tried downloading from here and from www.download.com (which just redirects to here). Now, with 11 million+ downloads they must be doing something right, but right now I have about 0% confidence in them. And I'll probably never go back. I suffer from "IIADD" (Internet Induced Attention Deficit Disorder).
But Seriously, this Stuff is Easy to Monitor For
The interesting thing is that monitoring of resources like this is unbelievably simple to implement. I've done it in Perl and in PHP in just a couple of hours. I ran this by the person who forwarded it to me and he said that he does it for his application (a blogging tool) too; it SMSes his phone when a serious error happens. I asked Lawrence at UserLand about this and he confirmed that they do it too. So, once again, it's the little guy that cares about quality that does it right. I find it embarrassing when my code breaks. So does he. Who gets embarrassed at GlobalScape, I wonder?
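Just to show how little code this takes, here's a bare-bones sketch in PHP. The URL and the alert address are made-up placeholders, not anybody's real setup; it just pulls the headers of a download link from cron and fires off an email (point it at an email-to-SMS gateway and you get the phone treatment) when the fetch fails.

<?php
// Sketch only: $url and $alert are made-up placeholders, not anyone's real setup.
$url   = 'http://www.example.com/downloads/client_setup.exe';  // the resource to watch
$alert = 'pager@example.com';                                  // e.g. an email-to-SMS gateway

// Fetch just the headers; any failure or non-200 response counts as "down".
$headers = @get_headers($url);
$up = ($headers !== false) && (strpos($headers[0], '200') !== false);

if (!$up) {
    // Run this from cron every few minutes and you get paged when the link dies.
    mail($alert, 'Download link is down', "Could not fetch $url at " . date('r'));
}
?>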
6:13:35 AM
FTP or So This is What the Other Side Looks Like
Hmmmm. First post ever before 4 am. Yes, I did go to sleep, I just cycled really, really fast. As part of my "You will not use FrontPage, You will not use FrontPage" aversion therapy, it has been recommended that I begin using a normal FTP client. Here are some experiences and comments. I'd love some feedback. I have both Linux and Win2K boxes that I use regularly and my FTP destination is a Linux box. My problem is a) I need to FTP more than 4,000 files from a BSD host that doesn't have shell access (otherwise I'd use SCP) to a new Linux server, and b) daily "I built this, gotta upload it" type use. Smart people I respect keep telling me "Dummy! Use CVS to publish your website". I don't doubt they are right, but being able to move files one by one and in groups is, to me, the ABCs. Until I'm really, really comfortable with that, I'll just feel silly. And, I'm fine with mget, mput, mode, prompt, etc.
Here's my real problem -- all of these GUI FTP clients feel "wrong". I mean, come on everyone. Let's illustrate this:
- I bought FTP Voyager since it's worked well for me in the past. Now it has a Windows XP style GUI. The flair on the icons makes them hard to read. Also, it doesn't support cut, copy, and paste from Windows Explorer, only drag and drop.
- I downloaded WS_FTP. It's _interesting_. Never saw a GUI quite like that.
- Nothing can do server-to-server. Yes, I know that there is FXP for this but, damn it, if you can FTP from there to me then why can't you give me the abstraction of server-to-server -- i.e. FTP from there to local memory and then to the remote FTP server (a rough sketch of this follows the list). I can't be the first person who ever switched hosting companies. (And why don't I want to move everything to local disk first? a] It seems silly and b] the file system naming rules differ between 2K and Linux, so I'm not convinced that in my 4,000+ files I won't have an error and never know it.)
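To make that wish concrete, here's a rough sketch in PHP of the relay abstraction I mean. Hosts, logins, and the file name are placeholders and error handling is deliberately thin; the point is just that each file goes from the old server into a memory stream and straight out to the new one, never touching the local file system.

<?php
// Sketch only: host names, logins, and the file name are placeholders.
$src = ftp_connect('old-bsd-host.example.com');   // the box I'm leaving
$dst = ftp_connect('new-linux-box.example.com');  // the box I'm moving to
ftp_login($src, 'olduser', 'oldpass');
ftp_login($dst, 'newuser', 'newpass');
ftp_pasv($src, true);
ftp_pasv($dst, true);

// Pull a file into a memory stream, rewind, and push it straight back out.
// No local file is ever written, so Windows-vs-Unix naming rules never come into play.
function relay($src, $dst, $remotePath)
{
    $tmp = fopen('php://temp', 'r+');
    $ok  = ftp_fget($src, $tmp, $remotePath, FTP_BINARY);
    if ($ok) {
        rewind($tmp);
        $ok = ftp_fput($dst, $remotePath, $tmp, FTP_BINARY);
    }
    fclose($tmp);
    return $ok;
}

relay($src, $dst, 'index.html');
ftp_close($src);
ftp_close($dst);
?>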
And let's ask ourselves about command line FTP. Yes, I can now shell into my new box, so I can do command line FTP and then an mget from remote to local. First I had to rediscover prompt to turn off the one-by-one prompting. A quick bit of googling gave me a University of Wisconsin??? (can't remember, sorry) internal help page that, as is the norm, is better documentation than anything else. So I moved over one directory really, really fast. That just left the 23 subdirectories under that directory. I see from the docs that mget is just a globbed version of get, so I don't really see how there could be a recursive flag, but I'm wishing and hoping for one. Standard GNU style help gave me the usual "You really should just look at the source, you know" type experience. Anyone know how to do this?
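In the meantime, here's roughly how I'd script around the missing recursive flag in PHP. It's only a sketch -- the host, login, and paths are made up, and it relies on the common (but not universal) trick that ftp_size() comes back -1 for a directory -- but it walks the remote tree and gets every file it finds.

<?php
// Sketch only: host, login, and paths are made up. It leans on the common
// (but not universal) behavior that ftp_size() returns -1 for a directory.
function mirror_dir($ftp, $remoteDir, $localDir)
{
    @mkdir($localDir, 0755, true);
    $entries = ftp_nlist($ftp, $remoteDir);
    if ($entries === false) {
        return;
    }
    foreach ($entries as $entry) {
        $name = basename($entry);   // some servers list full paths, some just names
        if ($name === '.' || $name === '..') {
            continue;
        }
        $remotePath = rtrim($remoteDir, '/') . '/' . $name;
        if (ftp_size($ftp, $remotePath) === -1) {
            mirror_dir($ftp, $remotePath, "$localDir/$name");           // a directory: recurse
        } else {
            ftp_get($ftp, "$localDir/$name", $remotePath, FTP_BINARY);  // a file: grab it
        }
    }
}

$ftp = ftp_connect('old-bsd-host.example.com');
ftp_login($ftp, 'user', 'pass');
ftp_pasv($ftp, true);
mirror_dir($ftp, '/htdocs', './site-mirror');
ftp_close($ftp);
?>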
Disclaimer: I offered to write examples for the GNU help system about a year ago and got either a "No Thank You" or just nothing at all (I offered to two projects and I can't remember which was which). I'm still a little frustrated over that. Calling all programmers: Examples are good!
And I have an iMac too, if that makes this better (pre-OS X).
Any suggestions for the "FTP Challenged"?
4:07:15 AM