I use iTunes ratings quite extensively for organizing my music collection with smart playlists. Recently, I started seeing songs I had never rated appear in my smart playlists; they show up with grey stars in iTunes that can’t be removed from the song directly. On the iPod/iPhone/iPad it’s even worse, as they show up as normal ratings, making you wonder how your library got messed up.
Turns out this is due to album ratings being applied automatically to the songs. This feature has actually been around for a while, but one of the recent iTunes updates seems to have created ratings for albums on its own and messed things up. Fret not, as there’s a simple solution – just go to the albums view and remove the album rating. Any songs you rated manually within the album are unaffected, of course.
Source: How to remove the automatic ratings from songs in iTunes? – Ask Different
I had this vexing issue where the Google Now notification would show the temperature in Fahrenheit in my desktop Chrome browser notifications. It seems this is a known bug that has not been properly fixed. The workaround that did the trick for me (as highlighted in this comment) was to switch to Fahrenheit in Google Now on the mobile app and then switch back to Celsius. The desktop notification finally started displaying the temperature in Celsius. A very silly issue, and a reminder that practically all the major OSes (both mobile and desktop) are now developed in a country that does not follow the metric system.
One of the commonly believed origins of the term “bug” (as used in the software context) is the anecdote of a dead moth causing the malfunction of the Mark II computer at Harvard in 1947. Well, it turns out that the term is much older and predates the computer industry by a pretty big margin. I found this out while reading Robert Cringely’s weekly post on Apple’s HD strategy (an interesting article, by the way), where he mentions the incident and the possible origins as a side note:
It turns out that “bug” was a common term for hardware glitches and dates back to the 19th century and possibly before. Edison used the term in a letter he wrote in 1878. This is no earthshaking news, of course, but simply reminds me how self-centered we are as an industry and there really isn’t much that’s truly new.
Wikipedia also mentions more or less the same thing in the etymology section for software bugs:
While it is certain that the Mark II operators did not coin the term “bug”, it has been suggested that they did coin the related term, “debug”. Even this is unlikely, since the Oxford English Dictionary entry for “debug” contains a use of “debugging” in the context of airplane engines in 1945 (see the debugging article for more).
So that clears up some word origins, I guess, but the computer industry can take consolation in having found the first “real” bug :-).