I’m going to de-clutter my desktop and pass on the 17” MBP and the (classic!) 23” Cinema Display to the other two members of the family. I no longer need a mobile solution to carry everything I do with me, and nowadays there are better ways to do things on the move than carrying a great-but-bulky laptop around.
I stumbled upon that new world last year, and decided to really get into it. In a distant past, long before the PC was invented, I used to build (and blow up) amplifiers, then simple digital circuits, then crude microprocessors - so it’s really all a fascinating bridge back to the past. I’ve been interested in getting to grips with energy consumption around the house for quite some time now, and all of a sudden there is this technology which makes it possible, affordable, and fun!
Lots of people seem to be getting into this now. It’s not likely that someone will come up with something as advanced as an iPhone - but what’s so amazing is that the technology is essentially the same. The playing field is leveling out to an amazing degree, because now just about anyone can get into exploring, designing, and developing hardware (and the firmware / software that makes it do things, which is usually the bigger challenge).
Anyway, in an attempt to create a structure for myself, I’ve set up the Jee Lab - a weblog and a physical area in my office to explore and learn more about this new world. If you’re interested, you’re welcome to track my progress there. It’s all open - open source, open hardware, open hype? ... whatever.
This weblog is the spot where I will continue to post all other opinions, ideas, and things of interest - as well as news regarding the software projects which remain as near and dear to me as ever: Metakit, Tclkit, Vlerq, and more.
Well... DTPO 2.0b + S510M are P H E N O M E N A L when used together.
It scans all my paper, from doodles to invoices to books. It converts text and inserts it invisibly into the PDF with all text indexed / searchable (and the big news in 2.0b3 is that the resulting PDF sizes are excellent). And now the biggie: DT can take these documents and auto-categorize them into different folders which already contain a few example documents.
So the workflow is: insert paper, push button, insert paper, push button, etc. Then as each OCR completes: enter some title (I just enter the main name / keyword, duplicate titles are fine). Finally, auto-categorize all documents in the inbox, and voilà; everything has been filed for eternity.
You can create “replicas” in DT to place a document in multiple folders, very much like a Unix hard link.
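The hard-link analogy holds up well. Here’s a minimal Python sketch of what a Unix hard link actually does (the filenames are made up for illustration):

```python
import os
import tempfile

# One "document" in a scratch directory.
tmp = tempfile.mkdtemp()
original = os.path.join(tmp, "invoice.txt")
with open(original, "w") as f:
    f.write("scanned invoice")

# A hard link gives the same file a second name, much like a replica
# places one document in a second folder.
replica = os.path.join(tmp, "replica.txt")
os.link(original, replica)

# Both names point at the same underlying file: same inode, link count 2.
print(os.stat(original).st_ino == os.stat(replica).st_ino)  # True
print(os.stat(original).st_nlink)  # 2

# A change made through one name shows up through the other.
with open(replica, "a") as f:
    f.write(" - paid")
with open(original) as f:
    print(f.read())  # scanned invoice - paid
```

Delete one name and the other keeps working - just like removing a replica from one folder leaves the document intact elsewhere.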
There’s a scriptlet which can be installed in Safari as a bookmark, and since I’ve added it as the 3rd item on my bookmarks bar, CMD+3 creates a web archive of the current page in DT (even with the bookmark bar hidden). There are also Dashboard widgets.
Again: auto-categorize puts these captured pages in a folder with documents most like it. And of course all documents can be found regardless of how they are organized, by entering a few characters in DT’s search box.
Did I mention how unbelievably effective this all is? Oh, yeah, I did ;)
PS. Another recent discovery I’ve started using heavily is Dropbox. Syncing done right (it uses Amazon’s S3). A bonus feature is automatic photo galleries, such as this one.
But they managed to mess up in awful ways:
- plugs don’t have a pushbutton, so turning a plug back on requires going to the controlling PC
- the system comes with a CD, which then insists on an internet connection and installs over the net
- you have to register with a code printed on that same CD (not the box, the actual disk)
- you must enter personal info, at least an email address anyway
- by default, a checkbox is set which grants permission to PW to obtain your power usage data
- and then... you’re told that you’ll get an email unlocking the “source” application
- (calling an application “source”, when it is anything but, is somewhat confusing for IT people)
- without that activation, the software barfs and exits
I’m not amused. This is hardware, after all - software-style protection that goes as far as obtaining my email address, insisting on internet connectivity, and requiring over-the-net activation is totally over the top, if you ask me. I’m happy to have these plugs now to monitor exact power consumption around the house and I sure intend to use them fully - but this approach is stupid.
These plugs made me wiser indeed, but probably not quite in the way the manufacturer intended...
Update: just got the confirmation email. Ok, so it took 15 minutes... my point is that this shouldn’t have been required in the first place.
Thanks to everyone who reported this - I’ve been out of the country (and offline) for a few days, hence the delay in fixing this.
On the left: 1.6 GHz Atom, 512 Mb RAM, 8 Gb SSD, 1024x600.
On the right: 25 MHz 486SX, 10 Mb RAM, 200 Mb HD, 640x480.
Here’s another size comparison, with a 17” MacBook Pro (Core 2 Duo) this time:
Mac: cost 10x, CPU speed 4x. Tosh & Mac both weigh 3x the A110L netbook.
SquirrelFish Extreme: 943.3 ms, V8: 1280.6 ms, TraceMonkey: 1464.6 ms
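For scale, a quick back-of-the-envelope calculation of how those timings relate (lower is better; this just normalizes against the fastest engine):

```python
# Benchmark timings quoted above, in milliseconds (lower is better).
timings = {
    "SquirrelFish Extreme": 943.3,
    "V8": 1280.6,
    "TraceMonkey": 1464.6,
}

fastest = min(timings.values())
for engine, ms in sorted(timings.items(), key=lambda kv: kv[1]):
    print(f"{engine}: {ms:.1f} ms ({ms / fastest:.2f}x the fastest)")
```

That puts V8 at roughly 1.36x and TraceMonkey at roughly 1.55x the SquirrelFish Extreme time - all three within a factor of two of each other.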
Given that JS is part of just about every web browser by now, it looks like this modern dynamic, ehm... scripting language is becoming seriously mainstream.
More seriously, I really subscribe to the idea that "if the only tool you have is a hammer, you treat everything like a nail". So, programmers should learn several languages and learn how to use the strengths of each one effectively. It is no use to learn several languages if you do not respect their differences.
That last sentence says it all. It implies learning several languages well - not just skimming them to pick on some perceived flaw. All major programming languages are trade-offs, and more often than not incredibly well thought-out. To put it differently: if you can’t find an aspect of language X at which it is substantially better than what you’re using most of the time, then you haven’t understood X.
The 1.0rc1 release of Wine is very easy to install (I did a full build from source, as described here, which builds a dmg), with the usual "drag the app (a folder in this case) to the Applications folder". Everything appears to be neatly tucked away in a 67 Mb app bundle. I set up MacOSX so that all files with extension ".exe" launch WineHelper. And that's it - exe files become double-clickable and they... work. Pretty amazing stuff.
Behind the scenes, Darwine launches X11 for the gui, and opens two more windows: a process list and a console log.
I tried Pat Thoyts' Tclkit 8.4 and 8.5 builds and they work - Tk and all. Both have the quirk that stdout, stderr, and the prompt end up on the console, but that's about it. Apps run on drive Z:, which is the Mac file system, so everything is available.
This is fascinating. Maybe the time has come to try running MSVC6 straight from MacOSX. It would be great to automate builds from a normal command-line environment, running only a few apps on Win32 (i.e. msvc6's "cl.exe") and staying with TextMate, bash, make, as usual.
I just moved my Vista VM off to a secondary disk, reclaiming over 10 Gb of disk space filled with something that taxes my patience and strains the Mac (1 Gb RAM allotted to VMware and it still crawls). Having Win98, Win2k, and WinXP VMs is enough of a hassle: I just tried upgrading XP to SP3 and it failed because 700 Mb of free space wasn't enough. Why didn't a quick disk check before the install warn me? Actually, the real question is: why is a 4 Gb virtual disk too small to comfortably run XP?
Darwine & Wine really deserve to succeed IMO. So that I can treat Windows as a legacy OS. And get back to fun stuff.
As I said, the result is perfect - and music is fun again, even at a very low volume.
Which makes this post hilarious ;)
Let me explain. I've been moving more and more information to disk lately, scanning in books and de-duplicating all cdrom/dvd backups (we had all our vinyl and negatives digitized a while back). It takes only 250 Gb so far - not counting mp3s/m4vs, that is (which have less uniqueness value to me). See this NYT article, where Brewster Kahle summarizes it well: "Paper is no longer the master copy; the digital version is".
But with bits, particularly if some of 'em are in compressed files, data integrity is a huge issue. Which is where RAID makes sense. So I got two big SATA disks, hooked them up to trusty old "teevie", a 6 year old beige box stashed away well out of sight and hearing. With RAID 1 mirroring, everything gets written to both disks, so if either one fails: no sweat.
But RAID does not guard against fire or theft or "rm -r". It's a redundancy solution, not a backup mechanism. I want to keep an extra copy around somewhere else, just in case. Remote storage is still more expensive than yet another disk, and then you need encryption to prevent unauthorized access. Hmm, I prefer simple, as in: avoid adding complexity.
So now I'm setting up RAID to mirror over three disks. The idea being that you can run RAID 1 just fine in "degraded" mode as long as there are still 2 working disks in the array. Once in a while I will plug in the third disk, let the system automatically bring it in sync with the rest, and then take it out and move it off-site again.
But the story does not end here. Bringing a disk up to date is the same as adding a new disk: RAID will do a full copy, taking many hours when the disks are half a terabyte each. Which is where the "write-intent bitmap" comes in: it tracks the blocks which have not yet been synced to all disks. Whenever a block is known to be in sync everywhere, its bit is cleared. What this means is that I can now put all three disks in, let them do their thing, and after a while the bitmap will be all clear. Once I pull out the third disk, bits will start accumulating as changes are written to the other two. Later, when putting the third disk back on-line, the system will automatically copy only the changed blocks. No need to remember any commands or start anything - just put it on-line. Quick and easy!
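The write-intent bitmap idea can be sketched in a few lines of Python - a toy model of the concept only, not how mdadm actually implements it:

```python
# Toy model of a 3-way RAID 1 mirror with a write-intent bitmap.
# This is a conceptual sketch, not mdadm's real implementation.
class Mirror:
    def __init__(self):
        self.disks = [dict(), dict()]  # the two always-on-line disks
        self.offsite = dict()          # the third, removable disk
        self.dirty = set()             # the write-intent bitmap

    def write(self, block, data):
        # Writes go to both on-line disks, and the block is flagged as
        # not yet synced to the (absent) off-site member.
        for d in self.disks:
            d[block] = data
        self.dirty.add(block)

    def resync_offsite(self):
        # When the off-site disk comes back, only flagged blocks are
        # copied - minutes of work instead of a many-hour full copy.
        copied = len(self.dirty)
        for block in self.dirty:
            self.offsite[block] = self.disks[0][block]
        self.dirty.clear()
        return copied

m = Mirror()
m.write(7, "photos")
m.write(42, "scans")
print(m.resync_offsite())  # 2 - only the changed blocks were copied
print(m.resync_offsite())  # 0 - nothing changed since the last sync
```

The second resync copies nothing, which is exactly why re-inserting the off-site disk is cheap: the bitmap remembers what changed while it was away.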
If a disk fails: buy a new one, replace it, done. Every few months, I'll briefly insert the third disk and then safely store it off-site again. If I were to ever mess up really badly (e.g. "rm -r"), I can revert via the third one: mark the two main disks as failed, and put the third one in for recovery to its older version.
Methinks it's the perfect setup for my needs. Total cost under €300 for the disks plus cheapo drive bays. Welcome to the digitalization decade.
See http://www.wettershop.de/ under Designprodukte/Thermometer. It sticks on the outside of a window, BTW.
Well, times have changed. TM is revolutionary, because its overhead is proportional to the amount of change. Every hour, my (quiet) external HD spins up, rattles for a few seconds, and then spins down again after 5 minutes (not 10, see "man pmset"). All you need is a disk with, say, 2-3x as much free space as the backed-up area. One detail to take care of: don't include big, frequently-changing files such as VMware/Parallels disk images and large active databases in the backup, because TM backs up per file. So I put all those inside /Users/Shared and exclude that entire area.
Oh, and TM does the right thing: it suspends and resumes if its external disk is off-line for a while, such as with a laptop on the road.
Note that attached disks cannot protect from malicious software and major disasters such as a fire - but this is something an occasional swap with an off-site disk can take care of.
I no longer "do" backups. I plugged external disks into the two main Macs here and TM automatically asked for permission to use them. End of story.
VMware has a quick setup whereby it asks for your name and a password and then lets the whole setup run with no further questions asked. After the whole install process, the system comes up logged in and ready to go - which is great.
Except that I did not enter a password, which leads to a system which does an auto-login with a password I do not know. It's definitely not the empty string. I don't even know who is to blame, VMware or Vista... After trying a few things and googling a bit, I was about to conclude that a full re-install would be my only option. No info on special boot keys to bypass/reset things; the built-in help says you need to know the admin password (or have a rescue disk, which... needs an admin password to be created). It all makes sense of course, but I wasn't getting anywhere with all this and was left with a setup I couldn't administer.
But guess what: as admin, you can create a new user with admin rights, and it doesn't ask for your password! So I created a temporary user with full admin rights and a known password, and switched to it. Then I reset the original admin's password - bingo. I did this after all the latest updates were applied, btw.
In other words: anyone can do anything on a machine running Vista if the current user is an administrator, without ever having to re-confirm the knowledge of that admin's password: simply create another admin and switch to it.
Pinch me. How many years has Vista been in development? How long has it been out as official release? On how many systems has it been installed?
Haven't used it - beyond a quick tryout. JD only caches files up to a (configurable) limit, the rest gets pulled from S3. Also, it looks like offline mode is not supported (yet, apparently) - though setting the cache big enough to always hold everything might do the trick.
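The caching behaviour described above can be pictured as a simple size-capped, least-recently-used cache - a toy sketch, of course, not JD's actual logic:

```python
from collections import OrderedDict

class SizeCappedCache:
    """Keep local copies of files up to a byte limit; evict the least
    recently used ones. Anything evicted must be re-fetched from S3."""
    def __init__(self, limit):
        self.limit = limit
        self.files = OrderedDict()  # name -> size, oldest first
        self.used = 0

    def fetch(self, name, size):
        if name in self.files:           # cache hit: mark as recently used
            self.files.move_to_end(name)
            return "local"
        while self.used + size > self.limit and self.files:
            _, evicted = self.files.popitem(last=False)
            self.used -= evicted
        self.files[name] = size
        self.used += size
        return "from S3"                 # cache miss: pulled over the net

cache = SizeCappedCache(limit=100)
print(cache.fetch("a.pdf", 60))  # from S3
print(cache.fetch("b.pdf", 60))  # from S3 (evicts a.pdf to make room)
print(cache.fetch("a.pdf", 60))  # from S3 again - it had been evicted
print(cache.fetch("a.pdf", 60))  # local
```

Which also illustrates the workaround mentioned above: make the limit bigger than the total data set and nothing is ever evicted, so everything stays available locally.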
Fortunately, the other 7,857,999 are ok :)
I can't help but think that the Tk browser plugin could have filled this spot ages ago. And much more, by now.
The news is that Flex will be open sourced before the end of the year. Think about it: open source code, written for a presentation engine that is fully controlled by Adobe. Clever. It's not easy to come up with scenarios whereby you get to lock in open source developments.
Update: another player in this field is Microsoft with Silverlight.
It came with a MacBook around it... super cool!
The drive was getting terribly hot, so I took it out of the 3.5" drive slot to get some more free-air cooling. Problems went away long enough to recover all partitions with a couple of restarts (and many hours of patience). Thank you Knoppix and rsync, for being there when I needed you.
As it happens, a big new external HD arrived today (Maxtor OneTouch III), so I have been busy making full backups of all the main machines around here. No fun, but I guess I got away with the occasional when-I-think-of-it full backup style I've been using for years now. It's not the work lost that I fear (I really do back up my active files a lot), but the amount of time it takes to restore a well-running system after serious hardware failures. Copying these big disks takes forever.
The failing IBM Deskstar 120 Gb drive worked for about 5 years without a hitch, so it really did well. Luckily, there's a spare 80 Gb around here which can take its place - but it sure takes a lot of time to shove those gigs around and get all the settings just right again!
When I have time for it - I'm having way too much fun with Vlerq right now!
I love that second smile effect. This is what creates a truly loyal customer base.