Detecting Earthquakes

This is a really interesting use of the accelerometers that are being built into more and more laptops these days. From an article in The Economist:

A group of researchers led by Jesse Lawrence of Stanford University are putting the same accelerometer chip to an intriguing new use: detecting earthquakes. They plan to create a network of volunteer laptops that can map out future quakes in far greater detail than traditional seismometers manage.

MSNBC also has an article.

What is interesting to me here is the use of lots of (cheaper) devices to do the work of a few expensive devices (seismometers, in this case):

On its own, an accelerometer chip in a laptop is not very useful for earthquake-detection, as it cannot distinguish between a quake and all sorts of other vibrations—the user tapping away at the keyboard, for example. But if lots of these chips are connected to a central server via the internet, their responses can be compared. And if a large number in a particular place register a vibration at almost the same time, it is more likely to be an earthquake than a bunch of users all hitting their space bars.
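The coincidence filtering the article describes can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual system: it buckets vibration reports by region and coarse time window, and flags a region only when many laptops report at nearly the same time.

```java
import java.util.*;

public class QuakeCoincidence {
    // A vibration report from one laptop: a coarse location cell and a timestamp.
    record Report(String regionCell, long epochMillis) {}

    // One laptop's reading means little (it could be typing), but many reports
    // from the same region inside a short window are unlikely to be keyboards.
    static Set<String> likelyQuakeRegions(List<Report> reports,
                                          long windowMillis, int threshold) {
        Map<String, Integer> counts = new HashMap<>();
        for (Report r : reports) {
            // Bucket each report by region and coarse time window.
            String key = r.regionCell() + "@" + (r.epochMillis() / windowMillis);
            counts.merge(key, 1, Integer::sum);
        }
        Set<String> regions = new TreeSet<>();
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            if (e.getValue() >= threshold) {
                regions.add(e.getKey().split("@")[0]);
            }
        }
        return regions;
    }

    public static void main(String[] args) {
        List<Report> reports = new ArrayList<>();
        // 40 laptops in "palo-alto" shake within the same 10-second window...
        for (int i = 0; i < 40; i++) reports.add(new Report("palo-alto", 1000 + i * 100));
        // ...while one user in "denver" hits the space bar.
        reports.add(new Report("denver", 3000));
        System.out.println(likelyQuakeRegions(reports, 10_000, 20)); // [palo-alto]
    }
}
```

The region names, window size, and threshold are all made up for the example; a real system would presumably also compare waveforms, not just counts.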


Changing Safari’s Default Search Engine

Even though I use both Safari and Firefox regularly, I have to admit that Safari is my primary browser, the primary browser being the one that contains my reference set of bookmarks. But the one thing that irks me about Safari is that I cannot change the default search engine; out of the box you are stuck with Google.

So I went looking this morning to see if there was something that could be done about this. It turns out that Inquisitor by Dave Watanabe does just that. It adds functionality to the search box, giving you auto-completion and additional results from other search engines, and allows you to customize, within limits, which search engine you want to use. Since it is now owned by Yahoo, you are limited to Yahoo and Google, but it does allow you to narrow your choice down to a particular country if you desire, which is a nice touch. It would have been nice to be able to add additional search engines, but it is a start, and since it is owned by Yahoo you can understand why this is not supported.

A couple of questions remain, though. There does not seem to be an easy way to uninstall it, and I am not sure whether Apple’s software update process will choke on it when the next version of Safari is released.

Maybe it is time to make that switch to Firefox.

Updated – Sept 28, 2008 – I have sent some feedback asking how to uninstall it; in the meantime, I think all you need to do is remove the ‘/Library/InputManagers/Inquisitor’, ‘/Library/Receipts/inquisitor.pkg’, and ‘/Library/Receipts/inquisitorPreflight.pkg’ folders. That seems to have done the trick for me.

State of the Blogosphere

Technorati has published a “State of the Blogosphere” for 2008 (via SearchEngineWatch.)

Worth taking a quick look.

Quit Complaining about the App Store

TechCrunch tells us to stop complaining about the Apple App Store:

So to developers out there, those who love the App Store and those who loathe it, recall that the power of the platform is to create a system which connects software developers and consumers of that software. But when someone makes a platform, they also control the rules which go along with that platform. And right now the platform to be on is the iPhone platform. To those who are unhappy with the restrictions that Apple places on the App Store, don’t complain. Just keep on coding.

Honestly, I can see both sides of the issue here. Apple’s App Store is, well, Apple’s App Store, and they are free to list, refuse to list, or delist any application they want; after all, they own and pay for the infrastructure, and they definitely have an interest in maintaining a certain level of quality. On the other hand, Apple has not been completely transparent about its selection criteria, which might make some people think twice about developing certain apps.

Java Performance, Lack Thereof

I am having an interesting issue with the performance of a Java program. The job of this program is to suck a lot of data out of a database and dump it to a file (with some formatting). Easy enough, but for some reason it is really slow and defies speeding up.

The first issue is that the program manages about 25-50 queries/second against the MySQL server (as measured by innotop). I know the MySQL server can handle about 2,500 queries/second because I logged all the queries emitted by the dump process (using ‘log’ in my.cnf), extracted the queries (the log includes tracking information), and replayed them using the ‘mysql’ command line tool.

What is odder is that when I check the state of the machine itself using ‘atop’, I can see that the Java process is using about 5-10% of one CPU and MySQL about 2-5% of one CPU, so the machine is basically doing very little. Yet the process just does not want to speed up.
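The two measurements together already point at the bottleneck. A quick back-of-envelope calculation (using the figures above, with the best-case 50 queries/second) shows that almost all of each query's wall-clock time is spent somewhere other than executing SQL:

```java
public class ThroughputMath {
    public static void main(String[] args) {
        // Observed: 25-50 queries/second from the Java process (via innotop),
        // versus ~2,500 queries/second when replaying the same log with the mysql CLI.
        double observedQps = 50.0;    // best case observed from the Java program
        double replayQps   = 2500.0;  // capacity demonstrated by replaying the query log

        // At 50 q/s, each query occupies 20 ms of wall-clock time end to end,
        // but the server itself only needs about 0.4 ms per query.
        double msPerQuery       = 1000.0 / observedQps;  // 20 ms
        double serverMsPerQuery = 1000.0 / replayQps;    // 0.4 ms

        // So roughly 98% of each query's time is spent waiting: driver overhead,
        // per-query round trips, or blocking I/O, rather than computation.
        double waitingFraction = 1.0 - serverMsPerQuery / msPerQuery;
        System.out.printf("%.0f ms per query, %.0f%% of it waiting%n",
                          msPerQuery, waitingFraction * 100);
    }
}
```

This is consistent with the low CPU usage on both sides: a latency-bound process, not a compute-bound one, which would suggest that issuing fewer, larger queries (or streaming one big result set) matters more than raw CPU.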

A colleague suggested running multiple threads in the program, which I tried, but I still get the same crummy performance; in fact, it got a little worse.

This is more than a little frustrating, as I come from the C side of things, where a program will pretty much use all the CPU power it can get if you let it, and I like that ‘pedal-to-the-metal unless restrained’ approach. Java seems to take a ‘restrained at all costs’ approach for some reason.

It looks like I am going to have to rethink my approach completely here.

Silver Tip Shark

This is a Silver Tip Shark, taken at the Silverado dive site off Cocos Island.

A few things are interesting about this shot (and the others taken there).

Silverado is a very shallow dive site close to the shore: the bottom sits at around 30 feet, and a ridge of rock jutting out from the shore tops out at around 15 feet from the surface. If you manage your air carefully you could be there for two hours, and since it is so shallow you would accumulate very little nitrogen in your body (we were all on nitrox, so we probably had zero accumulation, but I would need to check the dive tables for that).

This site is a cleaning station for Silver Tip Sharks: they come in and get cleaned by smaller fish (I have written about cleaning stations before). Since the sharks are large fish, it follows that the cleaner fish are larger too. What is interesting is that Silver Tip Sharks are blue-water sharks that usually don’t come close to land, so this is unusual. They were not bothered by us, and I was able to get some good shots.

Finally, this was the last dive of the trip (we had to beat someone else to the site, which was fine since we had the right of way), so we wanted to make sure we got all we could out of it. Of course fate was not going to be so easy on us: the thermocline had risen here and the water was quite cold, and I had forgone the second layer of my wet suit because I was too warm when we dove the very same site two days earlier.

Map-Reduce with a Different Flavor

Not sure how I came across Disco, but it somehow landed in my bookmarks of things to check out. Normally I would not post something about Map-Reduce, since there is already lots of easy-to-find material out there about it, but this one was interesting:

Disco is an open-source implementation of the Map-Reduce framework for distributed computing. Like the original framework, Disco supports parallel computations over large data sets on an unreliable cluster of computers.

The Disco core is written in Erlang, a functional language that is designed for building robust fault-tolerant distributed applications. Users of Disco typically write jobs in Python, which makes it possible to express even complex algorithms or data processing tasks in often only tens of lines of code. This means that you can quickly write scripts to process massive amounts of data.

Disco was started at Nokia Research Center as a lightweight framework for rapid scripting of distributed data processing tasks. Thus far Disco has been successfully used, for instance, in parsing and reformatting data, data clustering, probabilistic modelling, data mining, full-text indexing, and log analysis with hundreds of gigabytes of real-world data.

The two things that caught my eye were Erlang (which seems to be getting more and more traction these days, and is maybe the next language to learn) and the fact that it uses Python as the ‘driving’ language.
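For anyone who has not run into the pattern before, the map-reduce shape the quote describes is simple to sketch. This is not Disco’s API (Disco jobs are written in Python); it is just a minimal, single-machine word count showing the two phases a framework like Disco distributes across a cluster:

```java
import java.util.*;

public class MapReduceSketch {
    // Map phase: turn each input record into (key, value) pairs.
    // In a real framework this runs in parallel over chunks of the input.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\W+")) {
            if (!word.isEmpty()) pairs.add(Map.entry(word, 1));
        }
        return pairs;
    }

    // Shuffle + reduce phases: group pairs by key and fold each group.
    // In a real framework each key's group can be reduced on a different machine.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] lines = { "to be or not to be", "to code or not to code" };
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines) pairs.addAll(map(line));  // map each line
        System.out.println(reduce(pairs)); // {be=2, code=2, not=2, or=2, to=4}
    }
}
```

The framework’s job is everything this sketch leaves out: splitting the input, scheduling map and reduce tasks across machines, moving intermediate pairs to the right reducers, and retrying tasks when a node fails.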