
Friday, June 12, 2009

I'm In The Money?

At least, potentially, according to this New Scientist article.

It was actually never my plan to be a Data Scientist. And I don't really know anything about statistics, so I'm not quite there. I think it's probably right, though. We're at the point where we have tons of data, but nobody understands it.

I miss the '90s.

Friday, March 13, 2009

Convergence

A couple of things caught my eye this week.

First, an "instant on" internet-based OS. I think it's this one, but I can't really recall. The premise is, you start your machine and within ten seconds you have 'net access.

The concept here is that your "real" OS boots in the background. But very often, why bother? And, what if you just want 'net access to get to the second item?

An Internet-based operating system! Now, whether it's really an OS or not is a matter of some debate amongst the geeks, but the larger picture is "who cares?" Have access to all your stuff in ten seconds? Who's going to need much personal computing power, except for the specialty niches, like gaming?

Now, take this keyboard emitter and wrap it in an OLED, stuff the whole thing in a cute little white quiver with a picture of a blue apple on it, and you have yourself the device of the future.

I first postulated the hardware end of this in the '90s and, eventually, you'll be able to pack that keyboard emitter and/or OLED with enough power that you don't need to run an internet OS on it. But I could see some combination like this being the netbooks of the future.

Thursday, January 22, 2009

Microsoft Is Dooooooomed!

Actually, despite the seeming sarcasm, I do believe this. Microsoft is "doomed" (for some value of "doomed").

I've seen it coming for a while. I was in the trenches during the O/S wars (both in the '80s and the '90s) and one thing was apparent: Microsoft's power came from marketing--their monopoly (or near monopoly)--and not from technology.

People talk like Vista is a novelty, but Microsoft has a long history of releasing products so bad that, had a company without a monopoly released them, it would have gone out of business. In fact, most of Microsoft's competitors were killed by mis-steps much smaller than Vista.

Consider how many tries at Word, Excel, Access, Internet Explorer, DOS, OS/2 and Windows were needed to gain any kind of traction in the marketplace. Consider that OS/2 needed to be removed from their hands before it could take off, and they were so scared of it--a product competing on actual technical merit!--that they hired people to go on the web and lie about it. Consider that the Xbox is doing okay as long as "doing okay" doesn't need to include ever making a profit.

Consider how much trouble Microsoft created to interfere with their competitors--at the expense of their putative customers.

For that matter, consider who is responsible for unleashing upon the 'net an OS that could be so easily enslaved--and was, by default, easily enslavable--that now millions of zombie machines churn away sending spam, orchestrating denial-of-service attacks, and generally marring the greatest innovation of the computer era (the Internet).

And why? Because it's way easier to collect the money for a monopoly than it is to support the people you force to pay you. Microsoft wanted the cash from all those people wanting to e-mail photographs and buy stuff from Amazon, but they sure as heck didn't want the responsibility.

Which is interesting, in its own way, because if they had--if they had taken it upon themselves, they'd have a nigh-indestructible brand. An entire generation of customers would love them to their graves.

When you have a monopoly, all you have to do is out-wait your competitors. And when you're in the magical position of your competitors being your customers, well, you can drive up their costs in all kinds of creative ways, like promoting standards and forcing them to invest in them, and then later dropping them. You can be a "partner" and then steal their code. If you can't steal their code, you can take their employees. Ideally you can do both.

You can drop your prices because you don't need to make money on any given product. In fact, in preserving the monopoly, you don't have to charge for anything if it secures that monopoly. (That's certainly the motivation behind the Xbox.) Once the competition is out of the way, you can stop putting money into that product.

And life is good--as long as you can maintain the monopoly.

In tech, though, you can't. You can suppress new technologies for quite some time. But not forever. And if your strategy involves destroying other businesses by depriving them of money, you're in trouble when competing products arise from business models that don't depend on product revenue at all.

And here we are. Why should anyone pay for an operating system when perfectly good ones are available for free? Why should anyone pay for office software when perfectly good suites are available for free, and give you more freedom?

IBM was in a similar position 25 years ago, except their competition came from hardware getting cheaper. And ultimately their hard-earned monopoly--way harder earned than MS's, which started with IBM giving them barrels of money--crumbled and they had to reposition themselves as consultants.

Make no mistake, the monopoly money will keep pouring in. These difficult economic times, however, are going to have people looking sooner rather than later at "the Microsoft tax", and increasingly low-end hardware like netbooks are going to make the "free" in "free software" more appealing.

Eventually, MS is going to have to retreat to the niches it once assigned its "partners".

Tuesday, December 16, 2008

Legacy of the Panned

Went to see Moscow, Belgium today with The Boy, who I think can probably claim to be the only 13-year-old male in America to see it. (Not many turning out to see a movie about a 43-year-old woman juggling raising her children while her estranged husband and a truck driver compete for her affections. In Dutch. Or so I'm guessing.)

So I owe you two. Consider the following in the meantime, however.

Because showing widescreen feature films (which most have been since the early 1950s) on a 4:3 TV would leave black bands along the top and bottom of the screen (and, in many cases, a very small resulting picture), the whole technology of "pan-and-scan" was developed, in which a widescreen film (1.85:1, 2.35:1, or otherwise) was reframed as 4:3, roughly along the center, panning right and left "as needed" to keep important elements in frame. (I swear Blake Edwards used to frame dialogue with the two characters at the extreme ends of the frame deliberately, just to mess with that.)

So this butchery was allowed to continue, and few even commented on it until the '80s. As a result, pan-and-scan is still the dominant way films are shown on TV.

But wait, the widescreen TV is pretty standard these days! Does that mean they're showing the films as originally shot and framed? In a few cases, yes.

In most cases, however, the pan-and-scan version is being shown, stretched horizontally to fill the width of the widescreen TV.

So, you're seeing a butchered version of a film, where everyone looks short 'n' fat to boot. And while you can override this in some cases, I've seen a few situations where the cable box overrides the TV controls, locks in the stretch, and refuses to let the picture at least be put back in the 4:3 frame for which it was designed.
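If you want numbers on just how short 'n' fat, here's a quick back-of-the-envelope sketch (the 2.35:1 scope ratio is just one common widescreen example):

```python
# Back-of-the-envelope numbers for pan-and-scan cropping and for
# stretching a 4:3 picture across a 16:9 set. Ratios are width/height.

def width_kept_by_pan_and_scan(film_ratio: float, tv_ratio: float) -> float:
    """Fraction of the original frame width that survives a pan-and-scan crop."""
    return tv_ratio / film_ratio

def horizontal_stretch(source_ratio: float, screen_ratio: float) -> float:
    """How much wider everything looks when the source fills a wider screen."""
    return screen_ratio / source_ratio

# A 2.35:1 scope film cropped for a 4:3 set keeps only ~57% of the frame width.
print(f"width kept: {width_kept_by_pan_and_scan(2.35, 4 / 3):.0%}")

# That 4:3 version stretched to fill a 16:9 set makes everyone ~33% wider.
print(f"stretch factor: {horizontal_stretch(4 / 3, 16 / 9):.2f}")
```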

Reminds me of the fact that lines in text files are still largely delimited by a carriage return followed by a line feed, a holdover from when they were printed out on teletypes: the print head rode on a carriage that had to be returned all the way to the left, and then the paper scrolled up a line, so that the next line of text stayed on the page and didn't overlap the last.
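You can still see those teletype ghosts in any Windows-style text file. A minimal Python sketch (the file name is just a placeholder):

```python
# CR (carriage return, \r, 0x0D) and LF (line feed, \n, 0x0A): the
# teletype-era pair that Windows-style text files still use to end lines.
windows_text = b"first line\r\nsecond line\r\n"
unix_text = windows_text.replace(b"\r\n", b"\n")

print(windows_text)  # b'first line\r\nsecond line\r\n'
print(unix_text)     # b'first line\nsecond line\n'

# "notes.txt" is just a placeholder name. Opening in text mode with
# newline='' preserves the raw endings so you can actually see them.
with open("notes.txt", "wb") as f:
    f.write(windows_text)
with open("notes.txt", newline="") as f:
    print(repr(f.read()))  # 'first line\r\nsecond line\r\n'
```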

Technology's funny, isn't it? Butchered movies with short fatties--not so much.

Sunday, August 3, 2008

Ouch & Whoa!

The Sitemeter fiasco yesterday cost me about 20% of my traffic. If you missed it, Sitemeter made some change to their code that didn't agree with IE7.

Actually, even though the error was caused by IE7's famous non-standard-ness, IE7 overall is not that bad; but I cleaned up so many messes from IE6 and earlier versions, I'd never, ever, ever recommend anyone use a Microsoft browser.

Some of you might be interested in the latest Firefox or Opera, which are both free. I use Firefox more, but Opera has been a staple of mine for a decade. (I use them in different ways. I tend to keep Opera up for heavy research and Firefox for more casual day-to-day browsing.)

Meanwhile, the usual pointy-breasted and Traci Lords searchers are being diluted with Treadmill Desk mania! That's kind of cool.

We'll see how things proceed.

Wednesday, February 20, 2008

Randal Schwartz Talks Squeak/Etoys on the XO

Randal Schwartz, Perl expert/Smalltalk convert, talks about Etoys running on the XO (the One Laptop Per Child machine) over at Lab With Leo.

Randal runs through the moving-a-car/steering-a-car warhorse--if you've ever seen a demo of Etoys, that's the one you've seen--but this one has very good video and sound quality, and a nice energy.

Emphasis is on tech, naturally, so it probably won't help to stem the tide of "What do third world kids need computers for?" challenges, but he does mention that you can steer a car on someone else's computer. That I'd like to see!

Tuesday, February 19, 2008

Blake's First Rule Of Prognostication

I used to be a pundit, of sorts, on technical matters. I wrote a back-page editorial for a German magazine for years in which I talked about various issues and how I saw the tech world unfolding.

I was very bad at seeing the future--and soon stopped attempting to. However, I have observed that some people can make lots of money and press by pretending to see the future. And I can see what my major error was: I was always trying to predict details about what was right around the corner--what was going to happen in the next six, twelve, or eighteen months. You can't do that, of course, because it makes it too easy to check on your results.

If you want to be a successful fortune teller (and get the money and PR for it), you have to predict the big things that are going to happen, and they should be about 20 years out. Seven years is bold, but still pretty safe, and 100 years is too far to care.

So, my first rule of prognostication is this: Successful fortune-telling requires seeing things far enough into the future that you can avoid being checked on.

Indeed, most of the environmental "terrors" I've seen in my life have been predicted twenty years out. By the 1990s, we were supposed to be out of raw materials, most especially oil. There were supposed to be 20 billion people on the planet. And the new ice age was to be upon us.

The exception was nuclear winter, which required a hypothetical situation that virtually guaranteed no one would be able to check the results.

Heh. I got to ranting so much I forgot what made me think of this. (Via Instapundit.)

Human-equivalent AI predicted by 2029 [link to BBC article].

We needed a new AI prediction because the last time I recall such a prediction was 1989. Now that everyone's forgotten that one, we can move on to the new prediction.

There is a segment of Christians, I'm sure, that has been predicting the Second Coming in every generation, a hundred times over. My dad used to do some (data processing) work for Morris Cerullo, who was predicting it in the early '80s.

But some would have you believe that only the religious believe in false prophets.

Friday, February 15, 2008

Ink!

PopSci is bitching about inkjet refills. The linked article says that inkjet ink can cost up to $8,000 a gallon. The good news is that the free market has responded by providing a cheaper alternative, in the form of prints from drugstores and other traditional photo-finishing places.
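For scale, here's a rough conversion of that figure; the ten-milliliter cartridge volume is my own assumption for illustration, not a number from the article:

```python
# Rough scale of the $8,000-a-gallon figure. The ~10 mL cartridge volume
# is an assumption for illustration only, not a number from the article.
ML_PER_GALLON = 3785.41

price_per_gallon = 8000.00
price_per_ml = price_per_gallon / ML_PER_GALLON   # about $2.11 per mL

assumed_cartridge_ml = 10
print(f"${price_per_ml:.2f} per mL")
print(f"~${price_per_ml * assumed_cartridge_ml:.0f} of ink in a "
      f"{assumed_cartridge_ml} mL cartridge")
```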

I stopped buying inkjet printers a couple years ago, because what would happen is that I would buy a printer and it would be good up until the ink ran out. Then I'd drop the $40-$60 on new cartridges and the printer would still refuse to work.

Back in my school days, I got an Epson MX-80 printer. It saw me through high school and college, and in college I was printing out musical scores that literally required the thing to run for hours with most pins firing. (Dot-matrix.) I had it for about five years and you had a whole lot of slack about when to replace the ribbon.

I stopped using it when it caught on fire.

Subsequent printers have lasted for a year to a year-and-a-half at the most. A few lasted only six months. It didn't matter if I spent $40 or $180, they never survived. And they have huge operational costs.

My solution was this little baby. It's actually not that little. It's quite bulky, but it's a network printer so we just set it up in a corner of the kitchen.

I paid about $300 when it went on sale at Staples. There are a couple of key elements to this purchase. One is that the printer is a "business printer" and not a "home printer". Since the margins are so low on home products, most companies sell marginal products with no support.

Another is that it had a one-year full exchange warranty: if anything went seriously wrong that first year, they'd completely replace it. The printers had to be much more robust for that offer to work out for Oki.

Also it had 24/7 tech support. I called on a Saturday with a problem. And the tech support was in Canada.

So, what about the cost? Well, after three years, at $100 an inkjet printer, it would have paid for itself easily. But in those same three years, we've only had to replace the toner once. Now, replacement toner does cost $120 for all the cartridges, but in the same three years I would've replaced the inkjet cartridges at least six times and as many as ten times!

So, at the most pessimistic, the printer has been no more expensive than an inkjet, minus the hassle and frustration of having the inkjet break. A more realistic estimate would be that it saved me several hundreds of dollars.
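Here's that math roughed out; the "two to three inkjets in three years" figure is my own reading of the one-to-one-and-a-half-year lifespan above:

```python
# Rough three-year cost comparison using the numbers in this post.
# Assumption: inkjets lasting 1 to 1.5 years means buying 2-3 of them
# in three years, plus 6-10 cartridge swaps at $40-$60 each.
laser_total = 300 + 1 * 120                      # printer plus one toner refill

inkjet_low = 2 * 100 + 6 * 40                    # best case
inkjet_high = 3 * 100 + 10 * 60                  # worst case

print(f"laser over 3 years:  ${laser_total}")                # $420
print(f"inkjet over 3 years: ${inkjet_low}-${inkjet_high}")  # $440-$900
```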

The Oki is unfortunately weak under Linux. That's about the only negative. It was such a positive purchase, I was tempted by a Samsung printer I saw the other day that was very similar, also $300 purchase price--but did duplex printing.

See, that's the way technology is supposed to be: Tempting you to upgrade with new features, not being so shoddy that you're forced to upgrade and buy overpriced supplies to keep a dubious business model afloat.

Tuesday, February 12, 2008

Strike...Out?

The writer's strike is coming to an end and once again, it looks like the losers will be...both sides!

You know, a few years ago here, we had a supermarket strike. It was lengthy. And if you were the sort of person who shopped at the supers (I wasn't), you had to cross a picket line to do your shopping. Regardless of what side you might be on, a picket line puts you in the middle of a group of strangers' conflict.

Predictably, people were driven to other stores. (That is the point, after all.) But as the weeks--and then months--wore on, people found out that these other stores, why, they were actually better than the stores they were used to shopping at. For example, Trader Joe's is generally cheaper, less tricksy, more interesting, and better staffed than a supermarket. Gelson's has pricier high-quality stuff, but isn't actually as expensive as its reputation would suggest, as long as you stay away from the high-priced European imports. And you never wait in line long there.

And, as I predicted at the time--as someone who has long hated the big supermarket chains--people would find it hard to go back to the old stores. Four years later, those chains still haven't recovered, from what I can see. Trader Joe's can't open up stores fast enough.

When this writer's strike started, I predicted a similar thing happening to the big nets. It's the weekly demand for shows that makes the writers' strike so effective, but the upshot would have to be that people would be driven to other forms of entertainment. Once driven away--and current ratings suggest the damage has been real and substantial--people won't come back. Not all of them.

The whole system needs revamping, of course. It's archaic to demand that people organize their lives around pre-recorded video programming--to watch it on the broadcaster's terms rather than their own. And when you have a delivery system as big as the planet, one has to wonder about the merits of using an intermediary at all. Because that's what the nets are: intermediaries.

And in the case of cable, satellite dish, and so on, even more so. HBO makes a hit show like "The Sopranos" but if you want to watch it you have to incur all these other expenses. (Cable or dish, with installation, basic service and some kind of descrambler box.) Why can't a person just go directly to HBO, Showtime, the BBC, etc., and just get what they want directly?

Of course, this, in turn, negates cable channels as movie distributors: Why go to HBO when you can go to its source, Time-Warner, instead? The incestuous nature of these companies is probably a big culprit in delaying the technology.

So, for the consumer, the strike is probably a good thing, though not in the way the writers intend. But by accelerating the decline of traditional distribution channels and methods, they will eventually force the development of new channels to make up for it.

Friday, January 4, 2008

By The Pricking Of My Thumb Drive....

From my days as a full-time tech writer, I'm on Andy Marken's list for press releases and other fun stuff.

No, seriously, it can be a lot of fun. I'm not sure how I got on his list but he's smarter than the average flak. Amidst the breathless hyperbole--which is also entertaining if well written--he'll look at the real problem the product he's shilling for addresses.

One of those problems is that the industry has never really progressed past the 3.5" floppy stage. Back in the early days of the home computer, when the Apple ][ was king and there were a zillion different varieties of home computer, we had 5.25" floppies. (Go back further and you'll find 8" floppies which, had they persisted, would've probably nipped a bunch of dumb jokes in the bud.) Capacity of those first drives was about 140K or, in modern terms, 0.00014 gigabytes. (You could cut holes in the side and then flip them over to double the capacity, though.) They were your only storage, so there was no danger of outstripping them.

When the IBM PC came out, the 5.25" went to 360KB (or 720 with a hole puncher!) and then pretty quickly to 1.2MB, no punch required. They had finally made floppy drives that would read both sides of the disk. Then came the 3.5" floppies, which went up to 1.44MB--still chump change by modern standards, but with hard drives at 10-20MB, it meant you could back up your entire drive on fewer than 20 disks.

And usually far fewer. Back when your disk drive was largely filled with your own content, 20MB was a lot to fill. A megabyte will hold about 800 pages of plain text--which it all was back then--and I used to write about 4-6 pages a day, plus whatever code I was working on.

But the 3.5" floppy was set up in '87 or so. And it was creaking by the time CDs rolled around in the '90s. Installing an operating system from 15-20 disks was cumbersome, and with hard drives in the hundreds of megabytes, backups were a laborious, error-prone process. Along comes the CD with its whopping 600MB of storage.

But CDs were a terrible medium for the sort of dynamic storage floppies excelled at. First off, you couldn't initially burn them. They were Read Only. Then you could burn them, but you couldn't necessarily read them anywhere except the machine you burned them on. The rewritable ones were especially fussy, to say the least.

And it probably should be noted that this wasn't entirely coincidental, or purely a matter of technical limitations. Content providers want a read-only medium, and they want to be the sole source of that medium. To this day, some of us pay taxes on blank media as a result. (The mindset behind this is worth exploring, but this screed is already going long, so I'll save it.)

The technology is mostly ironed out at this point with CDs and DVDs, but once again, at 4GB, a backup can run to dozens of discs, and they're still not very good as rewritable media. They're also slow and fragile. (Back in the 3.5" days, you could slam a floppy into a drive and almost immediately start to read/write from it. Plus you could run over it with your car, burn it, put it through the washing machine and other horrors, with a reasonable expectation it would still work.)

So, I do think this solution is kind of cool:

It's not a cheap, disposable medium that holds about 10% of a typical hard disk, true. We may never have anything like that again. And it's not a flash drive--those are stuck at 8GB.

What it is is a physically tiny hard drive that ranges from 120GB to 320GB and runs at a slightly slower speed than most drives today, with a price that works out to around 80 cents a gig for the smallest configuration (and probably lower for the larger ones).

I guess the two reasons I find this interesting are: 1) I've been involved in tech long enough to remember the big, clunky hard drives of the past; 2) we're getting to the point where the mechanics of the drive itself are an increasingly small part of the cost. (A floppy disk, of course, is just dumb media that needs to be inserted into a drive, whereas this carries its own mechanics with it.)

It's not inconceivable that the "floppy of the future" will be a self-contained device like this, not only containing its own drive system but its own operating system. Stuff like this already exists, actually, but will it (or anything) ever hold the position the 3.5" floppy did?

And will it survive if you put it through the washing machine?