Also don't auto update.
When I had XP, running auto update is when I got viruses: the system slowed to a crawl and got so sluggish I'd have to reinstall everything every three months. The boxes where I had all that disabled never had any of those issues.
I've had Auto Update disabled on this Win7 box for going on a year and a few months now. I've been to hell and back on the internet and have never once gotten a virus, trojan, or malware (that I know of).
I only update when I'm forced to, because something I'm trying to accomplish or install requires that update.
I'll become a LAMP guru before I'll be some Ballmer chump end user of some crappy monkey-see-monkey-do child toy crap.
And Windows 8 has crossed that line. If the hardware outgrows Windows 7, and MS operating systems stay the course of Windows 8, that's exactly what I'll do.
Some of the least tech-savvy people I know are moving to tablet-type OS's, either iOS or Android, to do everything. Plug a keyboard into, say, the Galaxy Note and you can do most anything the majority of users need a machine for.
I wonder if Microsoft has been anticipating a loss in home market share, and stopped caring about selling Windows. Or maybe the Zionists got to them?
I wonder if Microsoft has been anticipating a loss in home market share, and stopped caring about selling Windows.
Microsoft will never abandon Windows as long as it's doing so well on the server side. Think of all the Windows Server and MS SQL Server licenses there are out there. Tons of profits.
Windows has lost the battle for tablets and smartphones, and desktop OSes are becoming unimportant for anyone who is not a developer. For developers, though, including developers of iPhone and Android apps, the desktop is still as important as ever, because that's where you develop tablet and smartphone software.
Some of the least tech-savvy people I know are moving to tablet-type OS's, either iOS or Android, to do everything.
so how is the preemptive multitasking ... LOL! .. kind of kills "do everything".
Windows as long as it's doing so well on the server side
wat why do servers need $6000/box GUIs LOL
I have 10.9 in front of me and Windows 7 immediately to my left.
I used to dual-boot into Windows Server 2012 but the active ugliness of that GUI forced me to retreat to Windows 7, even though the driver model on 8 does seem a lot more solid than the 2009-era Windows 7 install I have.
I just hated how 8 looked exactly like a tarted-up 3.1 desktop, with lack of drop-shadows, etc.
Apple is still 5 years ahead of Microsoft here in general UE quality.
Funny thing is I absolutely love Microsoft's 2017 vision:
but Windows 8 doesn't really seem to be on that path.
Think of all the Windows Server and MS SQL Server licenses there are out there. Tons of profits.
It's startling how fast things can change. Years ago I worked a lot on IRIX, some on Solaris, and occasionally HP-UX and AIX.
Now SGI doesn't exist, and most of those other OSes are items on my resume that are historical curiosities for people. Occasionally I get these contract gigs where a legacy system needs rescue and they need someone experienced to dig them out of a hole, but those are few and far between.
If fashions change I could easily see Windows shrinking its presence by 90% in as little as 3 years.
wat why do servers need $6000/box GUIs LOL
I don't follow you.
Apple is still 5 years ahead of Microsoft here in general UE quality.
People who think that Apple is or ever was ahead of the game have no idea how computers, including Macintoshes, work.
Just consider multithreading.
Implemented in IBM's OS/2 Warp in 1994.
Implemented in Microsoft's Windows 95 in 1995.
Implemented in Sun Microsystems' Solaris around 1994.*
Implemented in Apple's OS 8 in 1997.
* Sun did some bullshit "fiber" implementation and marketed it as "green threads", but they weren't real threads. Later Sun implemented real threads in their OS kernel.
Apple and its operating system have always been years behind the real software makers. Of course, Apple fanboys don't even know what multithreading is or why it is critical, because it has nothing to do with painting pixels on the screen. If it ain't fashion, it ain't important to a Mac user.
Apple has a very long history of
1. Being behind the times technologically.
2. Taking credit for other people's inventions (the GUI, the circular slider, the MP3 player, etc.)
3. Brainwashing their users into thinking that flash is more important than substance and that Apple's flash is the best flash, even when it's just a rip-off of what someone else came out with 5 to 10 years ago.
If fashions change I could easily see Windows shrinking its presence by 90% in as little as 3 years.
In the desktop/laptop and the tablet/smartphone arena, I'd agree. But there is no way Windows is going to lose 90% of its share in the server realm over even the next 5 years. There's too much time, effort, and money invested in Windows platforms as the hosts of websites, web services, and other server-based technology. Same thing with MS SQL Server.
Even the fact that you can get free as in beer database servers hasn't put a dent in MS SQL Server demand. And those licenses are damn expensive!
Windows will remain important in the virtualization and cloud markets, especially with HyperV. And that alone will produce enough profits to justify its continued development.
I don't follow you.
http://www.newegg.com/Product/Product.aspx?Item=N82E16832416797
Just consider multithreading.
let's
technical detail: the Lisa had multithreading in 1983, but it sucked because the overall system was a pig and ran at 5 MHz, too.
Windows 95 had multithreading but all 16-bit code had to take out a single lock so it was a critically flawed implementation.
Meanwhile, in 19 fucking 89 I brought home an IIcx that "cooperative multitasked" perfectly fine, making it maybe 10 years ahead of the industry. I had 5MB in that Mac, and could have a full working set of apps open, all running together at the same time, taking "null events" in their event loops to do small units of work in the background.
It was a PITA for app developers, but until networking got going in the mid-1990s background processing really wasn't a common use case for PCs anyway.
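That null-event style of cooperative multitasking is easy to sketch. Here's a hedged, minimal Python illustration (generators stand in for app event loops; nothing here is from any real Mac API): each task does one small unit of work, then voluntarily hands control back to a round-robin scheduler.

```python
# A minimal sketch of cooperative multitasking, in the spirit of the
# classic Mac's "null event" trick (and the Lisa's explicit yields):
# each task does a small unit of work, then voluntarily hands control
# back to a round-robin scheduler. Nothing can preempt a task that
# refuses to yield -- which is exactly the scheme's weakness.
from collections import deque

def task(name, units):
    """A task that performs `units` chunks of work, yielding between each."""
    for i in range(units):
        yield f"{name}:{i}"   # do one unit of work, then give up the CPU

def run_cooperative(tasks):
    """Round-robin scheduler: resume each task in turn until all finish."""
    queue = deque(tasks)
    trace = []
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))   # let the task run one unit
            queue.append(t)         # re-queue it for its next turn
        except StopIteration:
            pass                    # task finished; drop it
    return trace

trace = run_cooperative([task("editor", 2), task("printer", 3)])
print(trace)
# the tasks interleave even though there's only one thread of control
```

The interleaving is the whole point: app developers got background processing without the OS ever seizing the CPU, at the price of every app having to play nice.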
Now, when NT4 came out in 1996, I could admit that System 6.0.3 was finally eclipsed technically (and sadly, System 7.6 or whatever was out then wasn't all that better than 1989's 6.0.3), and OS 9 / X remained behind NT4 & Win2k until the 2002-2003 period.
If it ain't fashion, it ain't important to a Mac user.
LOL. Funny how since 1986 Apple has had the better PC on the market for me.
1986 Mac Plus vs PC AT -- Plus is tons more usable.
1989 I was in the market for a new machine and the IIcx was tons more usable than the MS-DOS / 386 combo. No competition, even though I had to pay around $1000 more "Mac Tax" (even with Apple's nifty student discount) to get the better system.
1995 the internet etc. meant it was time to upgrade from VGA to SXGA, and 68030 to PPC or Pentium 133. I was using Windows 95 at work and didn't need that jazz at home, so I stuck with Macs. Got a 7500, which bridged me through 1999.
But TBH 1999-2002 was Apple's nadir really. Wintel was hitting its peak ca. 2000 with the 440BX + NT4 platform, while Apple was trying to get NeXT stuff on Macs. (Intel gave Apple some breathing room with its RAMBUS/P4 excursion)
As for Apple being behind, LOL. Ol' Steve blew everyone away twice last decade; first with the iPhone in 2007 and then with the iPad in 2010.
Innovation is different from invention. The perfect illustration of this is another Jobs landmark innovation -- the laser printer.
Yes, HP beat Apple to market by a full year with the LaserJet in 1984.
But it sucked -- not being networkable, and its 128K of RAM only allowed for postage-stamp sized graphics.
And not being PostScript capable, or particularly well thought out at all, the HP's font support sucked balls. I don't want to even get into it since the details will clearly sail over your pointy head ('fonts are fashion amirite'). But suffice it to say that the HP was not going to ignite any "desktop publishing" revolution -- it was just a glorified LPR. I know, because that's exactly how I used them back in my college lab that had a LaserJet.
Apple has historically done a damn good job bringing compelling technology to market.
1979's Apple II+, 1985's LaserWriter, 1986's Mac Plus, 1987's Mac II . . . 1989-1994 NeXTSTEP/OPENSTEP . . . 1994's Powerbook 500 series . . . 1995's PCI PowerMacs . . . 2001's Powerbook G4 . . . 2003's iTunes Music Store . . . 2007's iPhone, 2010's iPad, 2006 - now Macbook Pro
The above is why Apple has the market cap of Oracle, MSFT, and INTC combined, not because Apple's customers don't understand technology as well as you, LOL.
FWIW I think Apple's hw advantage only really exists in the Macbook Pro -- the new Mac Pro is about 2X overkill for my needs and I don't understand what they're trying to do there.
(The iMac line with its embedded LCD also doesn't have much interest for me.)
http://www.newegg.com/Product/Product.aspx?Item=N82E16832416797
Shop around.
http://www.amazon.com/Microsoft-Windows-Server-Center-License/dp/B0093CAKMY
But yes, that's an expensive license especially for only two processors.
technical detail: the Lisa had multithreading in 1983, but it sucked because the overall system was a pig and ran at 5 MHz, too.
That's actually a myth. Multithreading -- true multithreading, not bullshit "green threads" or "fibers" or "cooperative multitasking" -- requires, almost by definition, pre-emptive multitasking, which in turn requires hardware support including hardware interrupts.
The Lisa used "fibers", which were an (imho, lame) cooperative multitasking implementation in which each application had to explicitly yield to the OS. The OS could not interrupt an application because the CPU didn't allow for it.
On the PC side, the first machines capable of pre-emptive multitasking were the 386s, which is why you don't see any pre-emptive or multi-threaded OS running on anything earlier than a 386.
I don't know when the hardware support was introduced into the Motorola chips that the Macs used to be based on, but it certainly wasn't in the Lisa hardware.
And quite frankly, if the Lisa had had multithreading, it would say something even worse about Apple: that they went backwards for 15 years!
Here's a very short list of the things that Apple pretends it invented but didn't, along with the truth.
1. The graphical user interface. Really invented by Douglas Engelbart at Xerox PARC.
2. The mouse. Also by Engelbart.
3. Hypertext (through HyperCard). Also by Engelbart.
4. Screen sharing. Also by Engelbart.
Hell, pretty much everything that happened at Apple during Steve Jobs's first stint there was stolen from Engelbart.
5. Multitasking
6. Setting pictures as desktop wallpaper.
7. The iPod scroll wheel is just a circular slider, a control that appeared in OS/2 going back to the early 1990s, if not late 1980s.
8. The iPod itself was a rip-off of the Diamond Rio, the first MP3 hardware player.
9. The smart-phone icon design. Palm OS had a large screen with icons on a home page. So did other smart phones.
10. Sliding to unlock
11. The notification bar
12. Black phones with minimalist hardware controls and flat icon UIs.
13. Siri
14. Even their advertising.
15. Video calling
16. Hell, Apple even falsely claims to have invented the term "App" and "App Store". That's pretty low. It's like claiming to have invented sex.
Perhaps Steve Jobs himself said it best back when he had just a little bit of honesty: "We have always been shameless about stealing great ideas." And yes, indeed, every great idea Apple has ever had, it stole from someone else.
If Netflix finally converts to HTML5 I'm rebuilding my Windows XP Media Center PC that drives my LCD into an OpenSuse machine.
HTML5 Video Playback UI
http://techblog.netflix.com/2013/10/html5-video-playback-ui.html
technical detail: the Lisa had multithreading in 1983, but it sucked because the overall system was a pig and ran at 5 MHz, too.
that would be Motorola's chip design .. Apple never made or designed semiconductors.
APOCALYPSEFUCK is Comptroller says
Berkeley Unix with Tits! Fuck Ya!
that about says it all... everyone was copying UNIX...
how do you know.. everyone said "we are not yet there"...
they were always comparing their product to the advanced Unix platform.
excellent episode of Computer Chronicles...
1979's Apple II+, 1985's LaserWriter, 1986's Mac Plus, 1987's Mac II . . . 1989-1994 NeXTSTEP/OPENSTEP . . . 1994's Powerbook 500 series . . . 1995's PCI PowerMacs . . . 2001's Powerbook G4 . . . 2003's iTunes Music Store . . . 2007's iPhone, 2010's iPad, 2006 - now Macbook Pro
The above is why Apple has the market cap of Oracle, MSFT, and INTC combined, not because Apple's customers don't understand technology as well as you, LOL.
Nearly bankrupt by the mid-90s... waiting for a suitable buyer, be it Sun or whoever.
They lost their corporate clients... and still don't have suitable hooks into corporate servers: Oracle, SAP, and a multitude of WinTel apps/platforms. That is why they are more or less a consumer product and not a crossover to the business market.
Even their own SAP ERP system, their corporate backbone, was running on WinTel products... as they were also running IBM mainframe back in the 80s. Shocking!!!
They might have switched since using intel X86 chips.. who knows now !!!
Today, Apple is lucky... yes, lucky... count the number of computer makers gone RIP. Amiga, anyone?
APOCALYPSEFUCK is Comptroller says
Apple is still 5 years ahead of Microsoft here in general UE quality.
Berkeley Unix with Tits! Fuck Ya!
You mean "Berkeley "Eunuch" with Tits!"
Hell, pretty much everything that happened at Apple at Steve Job's first stint there was stolen from Engelbart.
Everyone steals from someone else.. Non compete agreements (NCA) have not been enforceable in CA. It should be obvious from all the small company founders coming from larger former employers and competing directly for years to come. Perhaps the only reason we managed to survive during the early 90s recession while Route 128 (Boston) went to ashes..
I don't know when the hardware support was introduced into the Motorola chips that the MACs used to be based on, but it certainly wasn't in the Lisa hardware.
heh, this is correct.
http://lisaem.sunder.net/cgi-bin/bookview2.cgi?zoom=0?page=8?book=28?Go=Go
The Lisa OS would yield the CPU apparently, but it wasn't preemptive. Lisa OS was fully reentrant though, and had memory protection so the team understood what they were doing (alas, the original Mac team were kinda hacking things together as they went along, leaving tons of rough edges to be fought against for a decade+).
To simulate PMT the app programmer had to insert explicit "YIELD_CPU" OS calls at strategic points in the code. Implementation detail though.
At any rate, these details are *just* details and do not make up the bulk of the user experience or ability to actually get useful work out of the machine.
What matters is how well the system works as a whole, and what people like you with bullet-list buzzword mentalities fail to understand is that this holistic nature of systems means 'good enough' across the board will always beat a system with 'perfect' implementation in all but one critical area of failure, just like two pair in the hand will beat a busted straight flush draw.
The Amiga had arguably better multitasking than the Mac, but was horrifically crippled in many other areas such that nobody cared in the end.
pretty much everything that happened at Apple at Steve Job's first stint there was stolen from Engelbart
again, innovation is not invention. Successful Innovation is getting products shipped at price points people are willing to buy.
Apple has ALWAYS -- since 1977 -- been the industry leader here.
They didn't invent 802.11 but they saw the utility and did a helluva lot of work getting it productized in MacOS years ahead of Wintel.
Reviewing the past, I can say Apple missed the boat with the modern optical mouse -- invented by HP and successfully introduced to the mass market by Microsoft.
They also dropped the ball pretty bad with consumer 3D graphics, shipping the original iMac with truly crappy Rage IIc, being slow on the AGP bandwagon, and even today not really being aggressive about staying compatible with the mainstream desktop graphics card market.
Microsoft also stole a march on them with their DirectX effort in the 1990s, essentially inventing the concept of API as developer marketing material.
But balanced against that are all the things Apple has executed well on these past 20+ years, like buying NeXT to get its Unix-ish underpinnings ported to Mac.
The iPod itself was a rip-off of the Diamond Rio, the first MP3 hardware player.
ahaha haha haha.
"It shipped with 32 MB of internal memory and had a SmartMedia slot, allowing users to add additional memory. It was powered by a single AA battery which provided between 8 and 12 hours of playback time. Connection to a personal computer was through the computer's parallel port, with a proprietary connector on the Rio's edge."
30 minutes of 128kbit music. Parallel port connection with shitty music management -- on the device and on the PC.
Par for the course! Being a neckbeard you don't have the first inkling what innovation looks like, even when Apple hits a product out of the fucking gravity well like with the iPod.
The Rio demonstrated what the category could be, but sucked in important areas.
The original iPod did not suck. People didn't buy it because it looked good, people bought it because they could put 5-10GB of music on it, and firewire (and later, USB2) was more suitable connection than a fucking parallel port.
As for the 2007 iPhone, what Apple actually innovated is too subtle and zenlike for you to actually understand -- what they brought to market was the first touch UI. What they innovated in 2005-2006 was the absence of the stylus!
They did this by identifying what advances were necessary and productizing them in a very well-executed way.
Corning's Gorilla Glass. Just like the Canon laser engine (and Toshiba's 1.8" hard drive in the iPod), Apple didn't "invent" this but they certainly brought it to market fully developed.
Multi-touch capacitive sensors. HVGA @ 160DPI on a 3.5" full form-factor screen -- my Samsung A900 had 160DPI but at QVGA, while my HTC PPC6700 had HVGA but on a too-small 2.8" screen, because Microsoft didn't understand that touchscreen phones no longer needed so many hardware buttons (same mistake Android people were still making in 2006, too).
But the real secret sauce of the iPhone was the PowerVR graphics stack that gave the system its extreme, pause-free responsiveness, something only Android now is approaching and Microsoft had to totally shitcan a decade of phone OS work to approximate.
And no, Apple wasn't first to ship PowerVR on a phone, that honor actually belongs to Nokia with the N95 and N800, but of course Nokia fucked it up by forgetting to actually develop a usable driver stack for the OMAP's PowerVR package.
"With regard to the driver, it's buggy and messy and won't be released as it would be be troublesome to support (that's my impression); to get it rewritten into a better form would cost money."
http://talk.maemo.org/showthread.php?t=21697
Meanwhile Apple was developing its CoreAnimation API to facilitate all that swoopy GUI goodness that you don't like, LOL. Plus also shipping a great OpenGL ES implementation (which my genius friend at Apple actually wrote, btw) that really helped the platform once the AppStore (another critical Apple innovation, btw) got going.
Oh, I see you got to the AppStore thing. Whatever they called it, as an independent app developer I can tell you Apple certainly revolutionized the industry with their AppStore, even more than how they revolutionized how we use PCs with their fostering of desktop publishing in the late 1980s.
Yes, PalmGear etc. existed before the AppStore. As did other 3rd-party efforts:
http://www.mactech.com/articles/mactech/Vol.20/20.12/ECommercePartner/index.html
Apple came in like a wrecking ball in 2008 and delivered what ISVs really needed, and left everyone from Amazon to Microsoft to learn from the masters what innovation was all about.
As for the theft, I could care less. As Chairman Steve said, "great artists steal".
What is meant by that is when you are copying something, like what Samsung does, you are making something inferior.
But when you steal something, you take it and make it your creation through reworking it to what you want it to be. Apple certainly stole the GUI idea from Xerox (even though they paid for it via a contractual stock sale deal), but with the Mac (well, Mac Plus) they actually shipped a machine and OS that was productized sufficiently that the GUI could be usable by a critical mass of people, not just workstation users like with the Star.
if you think Apple copied the Rio or Nomad, you simply have no clue how tech really works.
Neither did Commander Taco, back in the day, so you're certainly not alone.
Apple certainly "stole" from them, LOL. Showed them how to actually make a usable product, that's what.
But when you steal something, you take it and make it your creation through reworking it to what you want it to be. Apple certainly stole the GUI idea from Xerox (even though they paid for it via a contractual stock sale deal), but with the Mac (well, Mac Plus) they actually shipped a machine and OS that was productized sufficiently that the GUI could be usable by a critical mass of people, not just workstation users like with the Star.
Had Apple been in Boston, Mass they would have lost every court case and lost money. Never would have made it.
Generally speaking, Mac users have always been the less technically inclined, in need of the most user-friendly interface to operate within, while keeping the nuts and bolts of the system out of reach of the typical user.
However, with the MacBook Air series of laptops and the iPads, they've carved a hardware niche that sits on top of the stack. I started with the 11" MacBook Air, and handed that to my wife as I upgraded to the 13" version. Both systems are set up to dual boot Mac/Windows 7.
99% of the time we use Win7, with the occasional foray into the Mac OS for shits and giggles... Those Airs are built like brick shithouses yet are light, with the sharpest screens available...
Dan8267 says
Apple and its operating system have always been years behind the real software makers. Of course, Apple fanboys don't even know what multithreading is or why it is critical, because it has nothing to do with painting pixels on the screen. If it ain't fashion, it ain't important to a Mac user.
I agree with all this except I would include the "desktop/laptop" with the servers. 75% of computer sales are for business, which has the same software investment as the server business.
Fashion won't mean jack as far as business changing OS's...
If fashions change I could easily see Windows shrinking its presence by 90% in as little as 3 years.
In the desktop/laptop and the tablet/smartphone arena, I'd agree. But there is no way Windows is going to lose 90% of its share in the server realm over even the next 5 years. There's too much time, effort, and money invested in Windows platforms as the hosts of websites, web services, and other server-based technology. Same thing with MS SQL Server.
People don't buy macs because they want the latest technology has to offer. They buy macs because they don't know the difference.
3. Brainwashing their users into thinking that flash is more important than substance and that Apple's flash is the best flash, even when it's just a rip-off of what someone else came out with 5 to 10 years ago.
The Amiga had arguably better multitasking than the Mac, but was horrifically crippled in many other areas such that nobody cared in the end.
There is no argument. Real pre-emptive multitasking albeit without memory protection. The OS was super lean and efficient.
Nobody cared in the end? Horrifically crippled? How? Lack of high non-flicker resolutions was a problem initially for business applications. Also, hard drives were initially very expensive, being SCSI only. These issues were remedied later.
And the Amiga was wildly successful in Europe, and if it weren't for Commodore's blunders (marketing/product-wise), it would have lived much longer.
There was no other computer platform that could compare in terms of graphics and sound capabilities at the time. It was so many years ahead it's not even funny comparing it to Macs, Ataris or PCs of the era.
Funny thing is I absolutely love Microsoft's 2017 vision:
Actually that only means that Apple and other internet players will actually do the real work to make that vision a reality. Microsoft has been marketers, not innovators, ever since Bill Gates left as CEO.
In Bill Gates's final year as CEO he gave a COMDEX keynote presentation that described and outlined, exactly to the letter, the iPhone, Amazon's cloud, Facebook, Twitter, and Google. All at a time when Windows CE and Windows Embedded were their most innovative achievement. Rather than pushing to develop a touch-screen handheld, they continued to develop an ugly version of something that looked more like the BlackBerry than the Palm OS handheld devices. By the time Apple and Google came out with the iPhone and Android, Microsoft released their biggest phone OS flop ever, still with chiclet keys and no touch screen.
There is no argument. Real pre-emptive multitasking albeit without memory protection. The OS was super lean and efficient.
Agreed, the Amiga was lightyears ahead. However I think the reason it failed was that it wasn't standardized and not easy to amend/extend components.
What matters is how well the system works as a whole, and what people like you with bullet-list buzzword mentalities fail to understand is that this holistic nature of systems means 'good enough' across the board will always beat a system with 'perfect' implementation in all but one critical area of failure, just like two pair in the hand will beat a busted straight flush draw.
Actually, I think you missed my point about Apple and Steve Jobs.
I'm not saying that you or anyone else should prefer one platform over the other. It's a religious matter. Nor am I saying that Apple is a bad company.
What I am objecting to is the elevation of Steve Jobs to a god-like status. His followers portray him as the great innovator whose ideas are so far ahead of his time that we dummies aren't even capable of understanding them. This, of course, is utter bullshit. Steve Jobs never had an original idea. Apple never invented anything. Everything that Jobs and Apple are known for was conceived and implemented by someone else first.
Now, if you want to make the point that it's ok to take other people's ideas and try to refine them, then fine, I'd agree. I think that IT laws and patents in this country are truly fucked up. Of course, it's hypocritical of Apple to sue other companies for "stealing" ideas that Apple stole from others, but that's another story.
However, if you are going to take the stance that "Jobs stood on the shoulders of giants", then don't elevate Jobs to the level of a god or great innovator. He was a pirate, same as most CEOs. He used people like condoms and discarded them when they had fulfilled their purpose. He screwed over others on the way to the top. He's no different than Bill Gates. And he never, ever innovated.
The true innovators are the unknown people who worked at other companies, whose ideas were copied by Apple. Great people like Engelbart. The most perverted thing about capitalism is that successful parasites get the credit earned by unknown grunts, who are treated like shit by the economic system.
If you want to say that Jobs was a great entrepreneur because he plundered better than his competitors, then fine; that's capitalism. But don't say that Jobs was a great entrepreneur because he was an innovator. He never had an original idea. He never built a single product. He just cracked the whip to force other, hard-working people to copy ideas from other, innovative people.
if you think Apple copied the Rio or Nomad, you simply have no clue how tech really works.
Neither did Commander Taco, back in the day, so you're certainly not alone.
Apple certainly "stole" from them, LOL. Showed them how to actually make a usable product, that's what.
The Diamond Rio was the first hardware MP3 player, and it did work, damn well. Yes, the iPod was a copy of this product.
Hell, the two products look almost the same except that the Rio is clearly the predecessor. Of course, the Rio came out about three years before the first iPod and during that time small LCDs became more advanced and much cheaper and other hand-held hardware also became smaller, faster, and cheaper. By the way, Apple was not at all responsible for these advancements as they were made by hardware manufacturers.
And for someone who knows nothing about technology, I sure make a decent living building new technology.
Some of the least tech-savvy people I know are moving to tablet-type OS's, either iOS or Android, to do everything.
I have this netbook that I bought about four years ago, and it was sitting around because with Windows 7 Starter running on an Atom CPU it was so slow it became unusable and frustrating. One wrong click and it would go spinning off into neverland.
However, I found this article about running Google Chrome OS on a netbook:
http://lifehacker.com/5820358/how-to-turn-your-netbook-into-a-chromebook-with-chromium-os
So I downloaded the Hexxeh build and tried it. I had some issues running off a USB so I put it on a SDHC card instead. The difference is night and day. Now it's fast and usable as a Chromebook. I have a few issues with Power Management, but I haven't really explored the controls.
The best thing is that this machine, which used to take something like 5 minutes to turn on, what with all the "settling down" that Windows does, now takes a few seconds, more like a smartphone or tablet. I did have to add in Flash support through a script (many distro builders don't like to mess with 3rd party media because of licensing) and it runs video, and streaming audio perfectly.
The Diamond Rio was the first hardware MP3 player, and it did work, damn well.
I bought one of those... from an eBay-type site called uBid!
I used it all the time waiting for the bus...but this was about a year before the iPod (maybe around 2002).
What I used to do is listen to an online streaming station, like 3wk.com. I bought this program for Windows that would record anything being played on my PC... but it also had this feature that let you cut up a sound file based on silence gaps. So I would play music for an hour, save the file, then snip it up into "tunes" and copy them onto my Rio.
I still use Windows XP at home. If I want to use something more modern, I boot to my Fedora 19 drive.
But there is no way Windows is going to lose 90% of its share in the server realm over even the next 5 years. There's too much time, effort, and money invested in Windows platforms as the hosts of websites, web services, and other server-based technology.
I disagree.
I've witnessed entire datacenters turned over from Solaris/Windows physical to Linux-based VM in 3 years. All it takes is the PHB going to a bunch of meetings and deciding it's time to "invest" in the hot new things. They promptly and deliberately throw out a bunch of working equipment and service setups and make up some numbers to justify savings down the road.
http://news.yahoo.com/windows-7-still-growing-faster-windows-8-230049436.html
Not surprising. All the people finally moving away from Windows XP because of the need for more than 3.5 GB of system memory, Direct X 10 or later, Visual Studio 2012 or later, volumes larger than 2 TB, etc. are moving to Windows 7, not 8. And few people are moving from 7 to 8.
Will Microsoft finally get the message that they are going in the wrong direction and need to backtrack? Will Microsoft finally start listening to their very vocal and frustrated users? Or will Microsoft finally kill Windows, not intentionally, but through their own arrogance in refusing to listen to the very people buying Windows for the past 30 years? Yeah, 30 years believe it or not.