FrankenPod

This is the follow-up post to my tale of two iPods.  Tonight I used the ‘extra hour’ to perform the logic board swap on my busted iPod, using the board from the ‘used’ one I picked up a week ago.

The first step was to take the used iPod apart.  By starting with the used one, I could learn how to do it before taking mine apart. I knew the used one was a little beat up, but I was still surprised by how dirty it was inside (mostly lint).  Below is a picture of it fully disassembled.

I found a pretty good .pdf file on powerbookmedic.com that documented the teardown, but I also had to refer to the ifixit.com site to get a better idea of how some of the cable clips worked.  Now that I’ve got hands-on experience doing it, the process is pretty straightforward.  Even cracking the iPod apart is quite easy now.  I do need to point out how insanely small those 6 screws are.

After taking everything apart, I wanted to verify the donor logic board was working. In the picture below you should be able to see the “Please Wait.  Very Low Battery.” message.  When I tried this with my non-working logic board, the screen did not light up at all.

Once I had both iPods completely disassembled, I performed the logic board swap and began to reassemble my (hopefully working) iPod.  The reason for going to this extreme is that the used iPod was pretty beat up, and the only part I wanted to take from it was the logic board.  In the picture below, I’ve swapped the logic board and reassembled the click-wheel and screen into the front panel.

From this point it was only a matter of minutes before I had a completely assembled iPod and was able to connect it to my PC.  Again I was greeted by the ‘very low battery’ message and a pretty long wait, long enough that I was starting to think I had done something wrong.

I was relieved when enough juice had made it into the battery and my Ubuntu system recognized the device.  From that point on it was pretty smooth sailing.  I booted up my VMware image of Windows XP that I use for iTunes, and there were no problems connecting and synchronizing the iPod.  I had wondered if the logic board would be tied to my serial number, but apparently the data on the drive alone defines the iPod.

I then proceeded to reassemble the used iPod using the bad logic board.  While I will likely sell this used one for parts, I figure it may as well be in one piece instead of a jumble of parts.  Once I did this, I was surprised to hear whirring coming from the device (it sounded like the hard drive spinning).  It was unresponsive to the reboot sequence (menu + center) and continued to whirr away.  I figure this explains why my iPod had such a flat battery: when the logic board failed, it went into this mode and drained the battery completely.

A few minutes later I realized that the used iPod was getting warm to the touch.  This was a little alarming, so I popped the cover off and disconnected the battery.  The battery was quite hot; clearly some unexpected load was being drawn by the bogus logic board.

At this point it looks like things went as planned.  I was able to transplant the logic board from the pretty beat-up used iPod into my “like new” iPod.  I also have pretty good evidence to back up my guess that the fault was the logic board.  And I was able to do it for less than a refurb nano would cost me.

I may still use the shuffle as it is certainly easy to carry around, but I’ll certainly appreciate the video capability on my next boring plane ride.


Java Performance in 64bit land

If you were buying a new car and your primary goal was performance, or more specifically raw power, then given the choice between a 4-cylinder and an 8-cylinder engine, the choice is obvious. Bigger is better. Generally the same applies when we look at computers, or at least that is how the products are marketed. Thus a 64bit system should outperform a 32bit system, in the same way that a quad core system should be faster than a dual core.

Of course, what a lot of the world is only starting to understand is that more isn’t always better when it comes to computers. When dealing with multiple CPUs, you’ve got to find something useful for those extra processing units to do. Sometimes your workload is fundamentally single-threaded and you have to let all those other cores sit idle.

The 32bit vs. 64bit distinction is a bit more subtle. The x86-64 architecture adds not only bigger registers to the x86 architecture, but more of them. Generally this translates to better performance in benchmarks (as having more registers allows the compiler to generate better machine code). Unfortunately, until recently, moving from a 32bit Java to a 64bit Java meant taking a performance hit.

When we look at Java performance, there are really two areas of the runtime that matter: the JIT and the GC. The job of the JIT is to make the running code execute as fast as possible. The GC is designed to take as little time away from the execution of that code as possible (while still managing memory). Thus Java performance is all about making the JIT generate better code (more registers help), and reducing the time the GC has to spend managing memory (bigger pointers make this harder).

J9 was originally designed for 32bit systems, and this influenced some of the early decisions we made in the code base. Years earlier I had spent some time with a PowerPC system that ran in 64bit mode, trying to get our Smalltalk VM running on it, and had reached the conclusion that the most straightforward solution was simply to make all of the data structures (objects) twice as big to handle the 64bit pointers. With J9 development (circa 2001), one of the first 64bit systems we got our hands on was a DEC Alpha, so we applied the straightforward ‘fattening’ solution, allowing a common code base to support both 32bits and 64bits.

A 64bit CPU will have a wide data bus, but recall that this same 64bit CPU can run 32bit code as well, and it still has the big wide data bus to move things around with. When we look at our 64bit solution of allowing the data to be twice as big, we’re actually at a disadvantage relative to 32bits on the same hardware: twice the data has to move across the same bus. This isn’t a problem unique to J9, or even to Java; all 64bit programs need to address this data expansion. It turns out that the dynamics of the Java language just tend to make this a more acute problem, as Java programs tend to be all about creating and manipulating objects (aka data structures).

The solution to this performance issue is to be smarter about the data structures. This is exactly what we did in the IBM Java6 JDK with the compressed references feature. We can play tricks (and not get caught) because the user (java programmer) doesn’t know the internal representation of the java objects.

The trade-off is that by storing less information in the object, we limit the total amount of memory that can be used by the JVM. This is currently an acceptable solution, as computer memory sizes are nowhere near the full 64bit address range. We use only 32bits to store pointers, and take advantage of 8-byte aligned objects to get a few free bits [ the stored value is the pointer shifted right by 3; shifting left by 3 recovers the address ]. Thus the IBM Java6 JDK using compressed references (-Xcompressedrefs) can address up to 32GB of heap.

We’re not the only ones doing this trick: Oracle/BEA have the -XXcompressedRefs option and Sun has the -XX:+UseCompressedOops option. Of course, each vendor’s implementation is slightly different, with different limitations and levels of support.  Primarily you see these flags used in benchmarking, but as some of our customers start to run into heap size limitations on 32bit operating systems, they are looking to move to 64bit systems (while avoiding giving up any performance).

There is a post on the WebSphere community blog that talks about the IBM JDK compressed references and has some pretty graphs showing the benefits.  And Billy Newport gives a nice summary of why this feature is exciting.

Retrospective

I’ve decided to start including work-related items in my blog here.  Please view the About page to see the standard disclaimer.  A fair number of the things I do at work can’t be discussed in public until they arrive somewhere in product form; by then they usually feel like old news to me, and often the information has leaked via other channels.  Thus, some of the work-related posts might seem a little boring to those “in the know,” but I hope to help people put 3+4 together by linking to various bits of information, or simply to provide a “straight from the developers” viewpoint.  I guess we’ll see how it goes; requests and feedback are welcome.

Recently Rick DeNatale posted a nice personal history of Smalltalk; I’m quite proud to have been part of building several of the products he mentions.  The OTI VM team started out building Smalltalk, but moved on to Java, first with VisualAge for Java, followed by J9 for embedded.  We still actively develop J9, which is primarily used as the core of the IBM JDK, but we also still do embedded work as well as a Real-Time Java offering.