it was just an effort to port Linux to PowerPC computers
during this period Apple engineers, using the term loosely, were monkeying around with various approaches, like maybe even using NT's PPC kernel as the OS with a "MacOS" API on top of that.
It was actually capable of being at least 10x faster than it was.
Actually there was a bug that they never fixed that forced the slower speed.
http://www.lemon64.com/forum/viewtopic.php?t=52439&sid=660bd9bc1f311f2aa17f8504670ff8dd
One look at my friends having to type LOAD "*",8,1 or whatever put me off of the C-64 permanently.
The funny thing is we use SATA and USB now, both serial interfaces. Commodore was ahead of its time!
Apple never used Linux. They used BSD, another version of Unix.
I stand corrected.
In either case though, it remains one of the best decisions Apple ever made, as continuing MAC OS would have been disastrous. Furthermore, by converting to Unix, Apple by luck of timing was prepared to enter the emerging MP3 player and smartphone markets, which it wouldn't have been if it had stuck with MAC OS.
I also stand by the statement that Linux was largely responsible for the acceptance of Unix in general on small portable devices including smartphones. Getting an operating system for free to use in your commercial product is a tremendous financial motivation, and that led to the recent revival of Unix on everything from smartphones to servers.
I don't think I was implying that
"Apple decided to use Linux as the operating systems of all its products"
I was wrong about MAC OS being based on Linux. I should have said BSD Unix. Yes, even my memory is imperfect. One confusion over the tons of Unix forks is understandable.
However, I still don't think I was implying that Linus Torvalds invented Unix, which is what I thought you meant by
your big misapprehension is thinking Linus invented Unix. His stuff was a side-show in the 90s.
Linux was first announced in 1991, a year after TBL started working on his www idea on NeXT
I don't think the NeXT team got a time machine and grabbed Linus' stuff then went back to the 80s
I would be extremely shocked if Apple, a company with a very long and deep history of copying other people's intellectual property, would not take entire chunks of functionality from the range of Linux distros whenever it was cheaper, faster, and easier to incorporate some useful functionality by doing so.
In any case, my point was that by making a royalty-free, open source version of Unix, Linus Torvalds created a huge market space that resulted in smartphones predominantly running Unix today, which opened the opportunity for the iPhone and huge profits for Apple. I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.
Pure conjecture. So much hatred? Lucky for me I don't have any emotional baggage to hate Apple; I switched from being a Windows user to a Mac user in 1997 and bought AAPL for many years. Ahem... lucky choice or a willingness to change views when the tide changes, up to you to opine.
Pure conjecture.
No. Simply empirical facts.
So much hatred?
Like most Americans, you make the false assumption that everybody either has to be on your team or the polar opposite. Life isn't like that.
Both Apple and Microsoft, and quite a few other players in the 1980s, have stolen a lot of ideas from each other. How is pointing out a historical fact hatred? If you state that America practiced slavery for nearly 100 years, does that mean you hate America? Is whitewashing history a form of patriotism? Why should I whitewash the history of computing?
Lucky for me I don't have any emotional baggage to hate Apple
One cannot love or hate corporations, countries, or other imaginary designations based on pieces of paper. You cheapen the words love and hate by using them to describe such trivial things.
Furthermore, you entirely missed the point of this thread, which is to inform, not persuade.
If you're informing, why so many ad hominem attacks? And pure conjecture? And so many opinionated assertions?
From comment 26, examples of emotionally charged opinions are: "extremely shocked", and "even go as far as to say".
continuing MAC OS would have been disastrous. Furthermore, by converting to Unix, Apple by luck of timing was prepared to enter the emerging MP3 player
you're kinda talking out of school here . . . whatever embedded OS the PortalPlayer chipset used on the first iPods, it wasn't running Unix!
NeXT/Apple's winning approach was a) Tevanian's Mach kernel, which served as the HAL + b) 4.3BSD API for processes etc and CLI tooling + c) OO Objective-C APIs to abstract all the C nastiness going on at the OS layer + d) visual design tools to put together apps from UI components (buttons, menus, textviews. etc) faster.
NeXT's kernel was called XNU, which stands for . . . XNU's Not Unix!
acceptance of Unix in general on small portable devices including smartphones.
nah, Sharp released the Zaurus in 2002, based on Unix, and that didn't go anywhere.
Unix is an implementation detail with Android. If they had chosen QNX or VxWorks instead they'd have been equally as successful.
VxWorks actually performed an interplanetary patch operation back in 1997, that was impressive.
Linux, and Unix in general is just like McDonald's, Walmart, or Microsoft.
They got big for real reasons, but these reasons are not necessarily 100% goodness and light, sometimes just contingency, accident, and network effects.
Unix got a good start with C, with BSD freeing the code from AT&T's control, with the GNU project, and with the UNIX mindset that a lot of little pieces working together beats hefty things working alone.
It's just code, all the way down, man.
ad hominem attacks
An ad hominem attack, by definition, can only be levied against a person, not a company. But if you meant expression of negative opinions... Well, in the entire original post, I expressed only three negative opinions about Apple products.
1. OS 9 was a piece of shit.
Most Apple users agree with this and wet their pants with excitement over OS X. But even in that opinion, my point was that Apple was highly motivated to abandon an operating system they simply could not continue to maintain. It was time for a break.
2. The Apple I and Apple ][ and Lisa were amateur hobbyist quality.
I stand by that opinion. You are free to disagree with it, but I see no harm in injecting a little opinion into a long talk about the sometimes beautiful, sometimes ugly, always interesting history of the rise of modern computing.
3. Apple's imitation of Douglas Engelbart's work was not as good as Engelbart's.
OK, you could take this as a negative judgement on Apple or as respect for the great Douglas Engelbart. I've often said that Microsoft's development model is to make a shitty copy of an existing product and keep refining it until it becomes good five or more years later.
So at most three negative opinions were expressed about Apple in the original post. That hardly constitutes Apple bashing, especially given that these opinions are well-founded. Only a fanboy holds the position that no criticism of an entity is acceptable. Are you a fanboy?
From comment 26, examples of emotionally charged opinions are: "extremely shocked", and "even go as far as to say".
However, Linux was open source from the beginning. People were encouraged to copy, share, and fork the code. I would be extremely shocked if Apple, a company with a very long and deep history of copying other people's intellectual property, would not take entire chunks of functionality from the range of Linux distros whenever it was cheaper, faster, and easier to incorporate some useful functionality by doing so.
How is that an attack on Apple? Are you saying that no developer at Apple ever looked up code on the Internet, said "hmmm, that solves the problem I'm working on", and then incorporated the code with or without modifications? Unless Apple had the very dangerous "not invented here" syndrome, it would be foolish to re-invent the wheel. It is also astronomically improbable that Apple, working on its own version of Unix, would not leverage the open-source Linux in any way. In fact, it would be damn stupid if they didn't.
I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.
Again, how is this an attack of any sort, much less an unfounded attack? Do you have such a negative opinion of Linux that the mere implication that it has influenced a product is some form of denigration?
Again, you are missing the point of this entire thread. Although I have sprinkled a few opinions in my recalling of history, as any human being would (that's a large part of what makes us human and our history interesting), my accounting is almost entirely factual and informative exposition without judgement. And in the few places where I expressed judgement, the purpose was to illustrate the motivation of the players to change their products such as Apple wanting to move to a different, proven operating system.
Do you really think I would never say anything bad about Microsoft? Obviously you haven't paid attention to my previous tech posts. I have no problem praising and condemning the same company for its good work and its bad. I don't engage in the bullshit arbitrary culture wars that dominate most people's thinking. In fact, the entire purpose of this thread was to steer people away from such culture wars by giving them a better and correct understanding of what operating systems and related concepts are and how they work.
you're kinda talking out of school here . . . whatever embedded OS the PortalPlayer chipset used on the first iPods, it wasn't running Unix!
Once again you are reading incorrect things into my words. In the statement
I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.
I was discussing the influence of Linux on the emerging portable computing market, not describing individual product implementation.
Forget about technology for a moment and think of economics. Without Linux, companies developing MP3 players, phones, and tablets would not have a clear OS to choose for their product. The market would be fragmented. But because of the popularity and royalty-free licensing Linux brought to Unix in general, hardware manufacturers could build their hardware to run Unix. Yes, there are lots of versions of Unix, but they are all closely related and have a lot of overlap. Hardware manufacturers could confidently build new hardware without the worry that it would not be compatible with the predominant operating system. This created the opportunity for rapid advancement in what kinds of things small portable devices could do, such as handle GPS, know their orientation, have cameras, proximity detectors, etc.
In this way, Linux influenced the entire portable computing market even for non-Unix based devices. Linux made Unix far more popular.
nah, Sharp released the Zaurus in 2002, based on Unix, and that didn't go anywhere.
In every period of great advancement, there are tens, hundreds, even thousands of losers for every winner. Pets.com didn't go anywhere, but that doesn't mean the whole Internet thing was a bust.
It's just code, all the way down, man.
Of course everything's code all the way down to the hardware. But code matters. Code can be of excellent or terrible quality. And it's not just the code itself. It's the architecture, the design, and the concepts behind the code, not just the expressions written in some programming language.
However, I don't want this thread to turn into an argument over which OS is best. We can do that in another thread. I intended this thread to correct the predominant misunderstanding that the pixels on the screen are the OS, that the desktop shell is the operating system rather than simply a replaceable interface to the OS.
As for continuing with MacOS in the 1990s being a disaster, I dunno.
MacOS was actually aggressively modernized 1997-2000 while the NeXT people rewrote and rearchitected OS X for its debut.
OS X only became the default boot OS in 2002, after 5+ years of development at Apple.
And in the end the compatibility story was the "BlueBox" which ran pre-existing "MacOS" binaries in a virtual machine, with decent interop with the OS X windowing manager to hide the hack.
Carbon:
http://en.wikipedia.org/wiki/Carbon_%28API%29
the modernized MacOS C API that supported preemptive multitasking (PMT), and ran as a first-class citizen on OS X (e.g. Photoshop, Excel, the Finder itself, etc) only became deprecated in 2007.
I had the opportunity to see the sausage as it was being made in the middle of this process, and I don't see Unix's brilliance here.
gcc was abandoned as crap, X11 was never adopted, POSIX could have been provided like how Microsoft wrote WinSock for NT.
Apple would be irrelevant now but for the iPod and iPhone, and the success of these didn't have anything to do with Unix.
The iPod succeeded because Apple cleverly bought all the 1.8" hard drives Toshiba could manufacture, enabling Apple to corner the market on gigabyte PMPs that could fit in a pocket (until flash got cheap enough). That and Apple doing the heavy lifting getting the iTMS off the ground in 2003, at $1 per song.
The iPhone succeeded for several reasons. Foremost because the telcos were shipping truly horrible product and Nokia just couldn't quite put all the right pieces together.
They had OMAP, which was equivalent to iPhone 1's hardware, but didn't have the API design or the GL drivers working well, while the iPhone had a fully performant PowerVR renderer doing all the graphics updates, along with the CoreAnimation API that made it possible for programmers to make the dynamic, animated UX that users loved.
iPhone's biggest wins were losing the stylus, losing extraneous phone-related buttons, cursor keys, and all that crap, which enabled them to go HVGA at 160 DPI, good enough to support a real web browser on the device.
Without Linux, companies developing MP3 players, phones, and tablets would not have a clear OS to choose for their product.
Why the #*%) does a mp3 player need Unix???
The NeXT people running Apple in 2000-2001 had been elbows-deep in Unix since 1986 with NeXT and earlier in school etc.
They didn't need Unix for the iPod, they found the chipset they needed with PortalPlayer, iterated on the final design and UI for some number of months, and shipped the product forthwith.
Why the #*%) does a mp3 player need Unix???
Why does a phone need to? It doesn't. But eventually a product is expanded and does more things. First it plays back MP3s, then it records voice, then it plays short videos as well, then it takes pictures, then videos, then it sends email...
The first hardware MP3 player was the Diamond Rio, which came out in 1998. Today I play music on my smartphone, which really shouldn't even be called a phone. It's really a hand-held computer that has telephony services as one of its many functions. There's a big difference between the Rio and a modern smartphone.
One can even argue that the only hand-held device you need today is what we call a smartphone. Why carry a separate music player, a separate voice recorder, a separate digital camera, a separate portable gaming console, or even a separate watch? I stopped wearing a watch years ago. Personally, I'd rather carry one device than several, and the smartphone is a great platform for doing all those activities. Of course, with that complexity, you need an operating system. I'd argue that having a platform, particularly a managed one, as well is a very good idea.
As for continuing with MacOS in the 1990s being a disaster, I dunno.
MacOS was actually aggressively modernized 1997-2000 while the NeXT people rewrote and rearchitected OS X for its debut.
Ah, but my point is that even doing that ends MAC OS and starts a new operating system, even if it is still marketed under the same name. It's an entirely different beast.
even if it is still marketed under the same name. It's an entirely different beast.
Why does a phone need to? It doesn't.
Actually, a "smart" phone needs "powerful" application frameworks to construct the UX of the apps the user is using on the phone.
Application frameworks don't exist in a vacuum, as a platform they have technical underpinnings that exercise lower API levels like POSIX or MS-DOS.
The first iPhone had a 400MHz 32-bit RISC ARM processor (gee, wonder why they didn't choose x86), 128MB of RAM, and 8GB+ of flash storage, plus PowerVR rendering engine.
This was roughly equivalent to the iMacs Apple was shipping in the late 90s, which was also conveniently the minimum spec of what OS X was supposed to support on its first iterations.
This ARM hardware (essentially OMAP) had no proper well-developed OS (just look at the crap Nokia shipped running on it) in 2005, so Apple could not buy an off-the-shelf OS solution, they had to develop it in-house.
With Mach as the HAL, they could get XNU up, and once XNU was up they could look at what existing desktop APIs they needed to get working for the apps they wanted to run on the phone.
Curiously, they did take the opportunity to redesign a lot of the existing AppKit framework, so much that they shipped it named "UIKit" instead. But most of the existing OS X platform came over, they didn't have the time or risk tolerance to reinvent any wheels here.
You guys are loads of fun. Do iOS and OS X run on the same operating system? Discuss.
iOS and OS X are both variants of Unix with a graphical shell on top. Are any two customized versions of UNIX the same operating system? Are Lutheranism and Catholicism the same religion because they are both Christian?
Like I said in the original post, Unix is best thought of today as a family of operating systems. There is no clear line that distinguishes one species from another. I would consider iOS to be materially different from OS X as it's customized to run on phones.
Now using PCs as the basis for their products was a great improvement. Intel released the MMX and later XXM instruction set, which are CPU instructions that deal with massively parallel arithmetic operations designed to support multimedia.
"their products" == apple products.
You make it sound like Apple switched to x86 BEFORE Intel came out with MMX. That is not correct. MMX was introduced in 1997. Apple announced their first Intel/x86-based machines in Jan. 2006. And there is no such thing as "XXM". I think you mean SSE.
I enjoyed your history of personal computing and operating system, and it is largely correct, but some of the details are not quite right. Another case in point is what you say about PowerPC.
First, the Intel chips were always clocked faster than the PowerPC chips, completely defeating the purpose of the entire RISC architecture.
Not true at all. The first several generations of PowerPC in the 1980s-1990s were far ahead of Intel, performance-wise and at times also clock-wise. Besides, the point of RISC is not necessarily to achieve the highest clock frequency. It is to achieve better throughput by simplifying the instruction pipeline, often making it shorter than in a CISC implementation. In other words, clocks per instruction (CPI) also matters. This is why AMD was creaming Intel on performance in 1999 or so. Intel was excessively pipelining their circuits to drive up the clock frequency (good for marketing), but it did not pay off on the CPI measure.
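To put rough numbers on that, the classic performance equation is: time = instructions x CPI / clock frequency. Here is a tiny C sketch with made-up figures (not real PowerPC or Pentium specs) showing how a lower-clocked chip can still win on CPI:

/* Toy comparison using the classic performance equation:
 *   time = instructions * CPI / clock_frequency
 * All numbers below are invented for illustration. */
#include <stdio.h>

int main(void) {
    double insns_cisc = 1.0e9, cpi_cisc = 2.0, hz_cisc = 500e6; /* hypothetical CISC chip */
    double insns_risc = 1.3e9, cpi_risc = 1.1, hz_risc = 400e6; /* hypothetical RISC chip */

    /* The RISC chip runs more instructions at a lower clock,
       but fewer clocks per instruction carries the day. */
    printf("CISC: %.3f s\n", insns_cisc * cpi_cisc / hz_cisc); /* 4.000 s */
    printf("RISC: %.3f s\n", insns_risc * cpi_risc / hz_risc); /* 3.575 s */
    return 0;
}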
Back to PowerPC: PowerPC existed long before (and after) it became Apple's replacement for 68k processors from Motorola. Pricing was not good, though. Around 2003, Intel started pulling away because their wafer technology (device technology, lithography, etc) delivered better cost/performance than IBM's and Motorola's. That is one of the main reasons that Apple made the switch to Intel in 2005-2006.
Apple got everything for free.
IOW, "good artists copy, great artists steal." If you want to become the most valuable corporation in the world, then instead of trying to build a new OS from scratch, take a free one and make it your own. From a customer POV, Apple has the best OS. I say this as a person who prefers more features, even though they introduce more complexity: Apple curates only the most valuable features and qualities, because Apple has taste. I like my non-Apple products, but Apple customers love Apple products, and that makes a huge difference in quality of life.
That word "curate" seems to be trending up in all kinds of contexts recently. Is it now a weasel-word for theft, copying and plagiarism?
Apple got everything for free.
IOW, "good artists copy, great artists steal." [...] Apple customers love Apple products, and that makes a huge difference in quality of life.
There used to be heated debates about which was better, iPhone or BlackBerry. I don't hear that anymore, as BlackBerry lies in ruins. The same with Samsung now.
It's the consumer that gets to pick the winners, no one else.
You make it sound like Apple switched to x86 BEFORE Intel came out with MMX.
I did not state or imply any such thing. I have no idea where you got that impression.
And there is no such thing as "XXM"
Back in the 1990s it was referred to as XXM just like the registers to distinguish from the original MMX set. Perhaps a misnomer that's been long corrected. It's been over 15 years since I did assembly language programming and I haven't kept up with the terms. The literature at the time did use the term XXM.
The first several generations of PowerPC in the 1980s-1990s were far ahead of Intel, performance-wise and at times also clock-wise.
PowerPCs in the 1980s? The first PowerPCs were introduced to the public in late 1993. As for clock speeds, both sides leapfrogged each other for a while, but the PowerPC topped out at about 3 GHz and x86 topped out at about 4 GHz -- well, recently it reached 5 GHz, but the point is that the great CPU clock speed increases are a thing of the past. So this invalidates the RISC approach. Instead, if you want a faster computer, you increase the CPU count and the core count and do parallel processing.
Why Apple Really Ditched PowerPC
But then things slowly started turning dark. At MacWorld '03, instead of delivering his promised 3 GHz systems, Steve Jobs's announcement topped out at 2.5 GHz, and the G5 was getting nowhere near the PowerBook line, running too hot and drawing far too much power. The G5 was failing to meet its expectations.
At MacWorld '05, Apple once again upgraded its Power Mac line, this time pushing the chip just 200 MHz faster than it had been a year ago, the elusive 3 GHz still nowhere in sight two years after it was promised. The G5 was still nowhere near getting into the PowerBook, and vague rumors of a new PowerPC core for portables cropped up. The G4 had practically stalled out at this point and the G5 was looking more like a stop-gap and less like a real solution to Apple's chip problems.
Apple had stuck to offering dual processors to keep competitive with Intel and AMD; now, as this tactic was running out of steam, IBM released dual-core versions of the G5 in order to offer Apple the benefits of symmetric multi-processing without the engineering overhead for a second chip. The result was the PowerPC 970MP, which per core ran slower than previous versions but, as a consolation, offered twice the cache. At this point, it became evident that IBM had seriously failed to meet its goals for the G5.
Apple was now in the same position it had been in five years earlier, with a processor that was falling behind and needing a new chip supplier desperately. Since moving away from the G4, Motorola had spun its chip division off into Freescale, with a G4 whose clock hadn't budged in years and shadowy promises of 64-bit dual-core chips that weren't even ready for sampling. Apple's portable line had been hit the hardest, standing still with the G4 while Pentium derivatives cleaned house. Apple was in dire straits.
The PowerPC was a worthy experiment to run, but Intel's x86 architecture won out. And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.
If you look at the history of IT from 1980 to today, it's all about copying other people's ideas. All the big players did it. And it's still going on. Remember when Bing used Google search results? It's a shame that a lot of small to mid-size companies got fucked over in the process though.
Today the business model seems to be, start a small business and hope to get bought out by a large one.
It's the consumer that gets to pick the winners, no one else.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
It's the consumer that gets to pick the winners, no one else.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
The consumer was not allowed to pick winners at that time. Microsoft was, and to a certain extent still is, a monopoly.
The consumer was not allowed to pick winners at that time. Microsoft was, and to a certain extent still is, a monopoly.
Honey, you either believe in the free market or you don't. The consumer never has unfettered choice.
And Microsoft is not Intel. Microsoft doesn't make x86 CPUs.
The PowerPC was a worthy experiment to run, but Intel's x86 architecture won out. And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.
I see you just refuse to change position. As Justme pointed out, Intel won because of wafer technology, NOT because CISC is better than RISC. RISC is now used in all iOS devices. If I were Intel, I would be worried, because what if Apple uses ARM RISC in the Mac!!! Wonder which processor architecture, RISC or CISC, is most popular for Android?
It's the consumer that gets to pick the winners, no one else.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
Living in the past again.
Future OS concepts: you talk like BG. Everything critical belongs to the OS.
what if Apple uses ARM RISC in the Mac
As long as Intel is a node or two ahead of everyone, this isn't going to happen. Plus this is a Windows world still, and being compatible here has strategic advantages, e.g. the orders Apple is getting from IBM for laptops.
And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.
yeah, well, there's no magic that made x86 ISA better. Starting from scratch nobody in their right mind would end up with that.
(Just like all intel efforts I must add, from USB HID to PC DSDTs)
x86 has the singular advantage of being the legacy ISA that is compatible with 30+ years of industry development (only relatively recently did Linux ditch compatibility with 386), and of course the binary ISA of the platform "OS" that 90%+ of the planet uses
Apple didn't need x86 compatibility last decade when working on the iPhone, so they didn't go to Intel with an order for x86 parts.
You know what ARM stands for?
The funny thing though is that Intel solved its x86 problem by just using it as an on-disk IL/bytecode target that got reworked into a proper RISC-ish ISA before it hit the execution units.
This is the best way to go, take a sane CISC like 68k ColdFire ISA and use that as your bytecode, and let each CPU optimize it, since compilers sure as hell can't optimize for all the different variations of CPUs the binary they're creating might encounter eventually.
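A toy sketch of that decode idea in C (the instruction and micro-op names here are invented for illustration; real x86 decoders are vastly more complex):

/* Treat one compact CISC-style instruction as "bytecode" and
 * expand it into simple RISC-like micro-ops inside the CPU. */
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_t;

/* Expand a hypothetical "ADD [mem], reg" macro-instruction. */
static void decode_add_mem_reg(void (*emit)(uop_t)) {
    emit(UOP_LOAD);  /* load the memory operand into a temp register */
    emit(UOP_ADD);   /* add the source register to the temp */
    emit(UOP_STORE); /* write the temp back to memory */
}

static void print_uop(uop_t u) {
    static const char *names[] = { "load", "add", "store" };
    printf("uop: %s\n", names[u]);
}

int main(void) {
    decode_add_mem_reg(print_uop); /* one CISC op becomes three micro-ops */
    return 0;
}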
So the whole PPC thing was largely unnecessary, Apple could have stuck with 68K right until now, if AIM could have kept up with INTC at the fab level.
Unfortunately, Apple's success with iOS came too late to save the Mac from Intel's embrace, though it must be said being compatible with the rest of the world has its benefits, once x86 outgrew so many of its horrible flaws.
Another funny thing is that I'm not aware of any x86 arcade games of the 80s or 90s, at least not until 3Dfx and DirectX were established on the PC platform, which gave arcade makers time-to-market advantage and hw cost savings going that way.
6809 and 68000 were standard. The one arcade game I had the privilege of semi-contributing to in the 90s used a 68000 for the logic and graphics display and a TMS34010 for the 2D graphics overlay.

[A linked chart] shows Intel's penetration of the CPU design of arcade games.
yeah, my fiery hatred of x86 has only been slightly moderated by having to use the damn things, 2005 - now.
Luckily I work so far up the stack the ISA is completely irrelevant. I actually haven't even had to look at assembly for over 20 years.
Some seem to forget the Linux-based Android is, far and away, the dominant mobile OS by market share, especially beyond the USA.
It wasn’t long ago that Android overtook iOS as the most popular mobile platform in the United States, but as the release of Apple’s first large screen phone approaches, Android’s lead is beginning to look insurmountable. According to the latest data from Kantar Worldpanel ComTech, Android now holds 61.9% of the U.S. market share to Apple’s 32.5%, the lowest percentage iOS has captured since the iPhone 4s launched in 2011.
Internationally, Android is even more ubiquitous, with 82.7% market share in China and 73.3% across Europe in countries including the UK, France, Germany, Spain and Italy. That’s not to say Apple isn’t showing signs of life — ComTech also notes that the iPhone 5s and iPhone 5c were still the top selling phones in Britain during the month of May, even following a hugely successful launch for the Galaxy S5. On the other hand, 17% of customers who purchased a Galaxy S5 were switching from iOS to Android.
States should pass a law requiring state universities to make a fiscal case for staying in the expensive Apple ecosystem. R and RStudio don't run on Linux? 4D is a good, modern database? What is the cost difference between a Linux build and a proprietary Mac desktop/laptop with similar memory, CPU, graphics, etc. components (ho, ho, ho)? Don't the university servers already run on Unix?
It's the consumer that gets to pick the winners, no one else.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to the x86 affected a whopping 3-4% of the market, tops. Not exactly a consumer market stampede. Consumers picked the Windows market and didn't give a shit in any way, shape, or form what the chip was. If IBM had picked the 68000, which they might have, then the entire PC market would be Motorola right now and almost no one would know the difference. The 68000 peripheral chipset wasn't going to be ready in time for the original PC release date by at least 6 months, so it just wasn't a consideration. IBM might have gone 8088 no matter what, they used the 8085 in the Datamaster and were familiar with it, but if there had been a fully functional 68000 to compare to, it would have been a tough call. On the other hand, I've read several articles that said the 8-bit-bus 8088 was chosen because a 16-bit chip (8086 or 68000) would have generated serious political opposition from the minicomputer division.
Why are you so fixated on apple?
Is the FED behind this?
1995 called. They want their heated online platform discussion back.
Jesus, I *am* hitting my 20th anniversary of arguing on the internet just around now.
: (
Intel won because of wafer technology, NOT because CISC is better than RISC.
Given that the industry has hit a wall in clock speeds, what makes you think that a reduced instruction set is better than a rich one?
Why are you so fixated on apple?
I'm not. I covered quite a bit. But when you only have two brands left, Coke and Pepsi, it's hard not to mention one.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to the x86 affected a whopping 3-4% of the market, tops.
Doesn't that prove my point that the consumer base chose the x86 line over several alternatives?
Intel won because of wafer technology, NOT because CISC is better than RISC.
Given that the industry has hit a wall in clock speeds, what makes you think that a reduced instruction set is better than a rich one?
Hey, I didn't say that :wink:.
And they picked the x86 PC over both the MAC Motorola and the PowerPC.
What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to the x86 affected a whopping 3-4% of the market, tops.
Doesn't that prove my point that the consumer base chose the x86 line over several alternatives?
Not at all, consumers chose MS/DOS. If IBM had gone with the 68000 and Apple with the 8088, the PC world would have been all Motorola. The vast majority of PCs were sold to businesses, especially in the first 5 years. Business managers wanted the magic phrase IBM, which also meant MS/DOS. Once MS/DOS was established in the business world, and with business people wanting the same machine for home, Apple was relegated to a couple of niche markets. No one ever cared what chip was inside other than you and Intel.
I, and lots of other people, never bought Jobs' official reasons for switching chips. AMD had faster chips than Intel. Gamers loved AMD. I think Intel made Jobs a killer deal on price to knock IBM/Motorola out. Consumers had zero to do with it.
There has for some time been much confusion about what an operating system is and what makes a good one. So I've taken the time to offer a bit of clarity.
Operating System
An operating system is the software that operates the system. It's a self-defining term. An operating system does a lot of important stuff that most people are completely ignorant of and don't give a damn about except that those things affect whether or not the computer works and how fast it is. The system that the OS operates is the hardware and many OS entities that support running applications and other software on your computer. Such entities include processes, threads, file handles, network stacks, memory blocks, and so forth.
The operating system essentially does two things. The first is operating the hardware and managing resources. This is why it's called an operating system. The second, and equally important, thing that an OS does is provide an Application Programming Interface or API through which all other software can run and interact with the hardware as well as with each other. In Windows, there used to be an API called Win16. That's now retired. Today Windows offers two distinct APIs called Win32 and Win64, which are 32-bit and 64-bit versions.
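To make the API idea concrete, here is a minimal C sketch of a program calling the Win32 API directly, no platform layer in between (error handling trimmed to the essentials):

/* A program talking straight to the OS API (Win32 here):
 * the kernel, not the application, performs the actual I/O. */
#include <windows.h>

int main(void) {
    HANDLE h = CreateFileA("hello.txt", GENERIC_WRITE, 0, NULL,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;
    DWORD written = 0;
    WriteFile(h, "hi\r\n", 4, &written, NULL); /* OS writes the bytes */
    CloseHandle(h);                            /* OS reclaims the handle */
    return 0;
}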
Windows 3.1 and earlier were not operating systems, but rather graphical shells that ran on top of the operating system called DOS. OS X and iOS also are NOT operating systems but rather platforms -- I'll define that soon -- that run on top of the operating system. The last operating system that Apple wrote was OS 9 in 1999, which was such a flaming piece of shit that Apple decided to stop even trying to write OSes. This was in fact one of the best decisions Apple ever made, and I mean for Apple. The other best decision Apple ever made was to stop making hardware and simply use PCs.
PC or MAC
The term Personal Computer or PC originally meant any computer an individual could own. It had to be cheap and small compared to the mainframes and minicomputers. A minicomputer was barely the size of a refrigerator. In the 1970s, this was considered small. PCs were any microcomputer. Today we don't use the term microcomputer, but rather desktop and tower. Later came the laptop, tablet, and phone form factors.
By 1984 the term PC did not mean any computer owned by an individual because the entire microcomputer industry had boiled down to two players, Intel and Apple. PC then meant any computer based on the Intel x86 line such as the 8086, 80286, 80386, 80486, 80586 or Pentium I, etc. Intel stopped using the 80x86 names because it couldn't trademark them due to a court ruling that they were descriptions. So, in order to market against AMD, another company that made x86 CPUs, Intel trademarked "Pentium" and started calling its chips by that name, which is why today it's far more confusing which chips precede which.
On the Apple side, Apple made computers like the Apple I and Apple ][ and eventually the Lisa, which quite frankly were amateur hobbyist quality. These were the only true Apple computers in the fullest sense. This all changed when the executives at Xerox Corporation, the makers of photocopiers, made one of the stupidest mistakes in history. They showed Steve Jobs the work of the great Douglas Engelbart.
Douglas Engelbart invented the mouse and pioneered the graphical user interface; Xerox PARC built on his work and added Ethernet. These inventions were decades before their time. Steve Jobs saw them and, as was common in the early days of computing, ripped them off wholesale. And the Apple equivalent of the graphical user interface was a piece of shit compared to Engelbart's already completed version.
So Apple went on to build the Macintosh, which used the Motorola architecture. Motorola is a company with a long history of embedded computing (think electronic security doors, building automation, cell phones, processors in various parts of automobiles, etc.). Specifically all true Macintoshes, i.e. before the PowerPC, used Motorola 680x0 chips from 68000 to 68040.
However, Apple wasn't able to compete well against the Windows/PC (or Microsoft/Intel) alliance using the Motorola chips. The Intel chips were just better for desktop computing. Not that the Motorola chips weren't great for embedded devices, but Intel offered faster machines that could do more.
So Apple and IBM got together and tried a new idea, Reduced Instruction Set Computing or RISC. The idea was to eliminate most operations a CPU could do, so that the CPU's clock speed could be drastically increased. The CPU would have to execute more instructions to do what an Intel chip could do with one complex instruction, but the CPU hopefully could do each simple instruction much faster. However, RISC turned out to be a flop. First, the Intel chips were always clocked faster than the PowerPC chips, completely defeating the purpose of the entire RISC architecture. Second, the complex instructions were just quite frankly awesome and made a lot of things like playing and editing video possible.
So Apple abandoned the PowerPC, which despite its deceptive marketing name was not a PC, and simply moved to the PC platform. Remember, the term PC and the term Windows are completely different. Windows is an operating system that runs on PCs and other devices (phones), and PC is a computer architecture that can run a multitude of operating systems including Windows, OS/2, OS X, Solaris, Linux, etc.
Now using PCs as the basis for their products was a great improvement. Intel released the MMX and later XXM instruction set, which are CPU instructions that deal with massively parallel arithmetic operations designed to support multimedia. These instructions are beyond awesome. Unless you've written multimedia code in C, ported it to assembly to double its speed, and then written XXM code to increase the speed by two to three orders of magnitude, you don't comprehend how awesome these instructions are. I do not exaggerate when I say that without these instructions, video playback and certainly video editing would be impossible on computers. And Apple made the transition to PCs right when multimedia was taking off. If it hadn't, the entire Apple experience would have ended.
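For a small taste of this kind of data-parallel arithmetic, here is a C sketch using the SSE intrinsics (the registers these map to are named XMM): four additions happen in one instruction instead of four separate ones.

/* Add four pairs of floats with a single SSE instruction. */
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

int main(void) {
    float a[4] = {1, 2, 3, 4};
    float b[4] = {10, 20, 30, 40};
    float r[4];

    __m128 va = _mm_loadu_ps(a);          /* load 4 floats into an XMM register */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(r, _mm_add_ps(va, vb)); /* one instruction, four additions */

    printf("%g %g %g %g\n", r[0], r[1], r[2], r[3]); /* 11 22 33 44 */
    return 0;
}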
Today, the only computers that you use that are not PCs are your tablets and phones. Well, some tablets are PCs, but others like the Google Nexus 10 are not. Intel has recently entered the smartphone market, but I don't know if their chips are following the x86 architecture or not, and Intel has a lot of catching up to do. I'm sure the executives at Intel are kicking themselves, or more likely their engineers, for missing the smartphone boat ten years ago.
Hopefully at this point you have a firm grasp of what an operating system is and what a computer platform is, and what the differences between the two are. There is a many to many relationship between OSes and computer platforms. A single platform like the PC can run many operating systems and can even have dual or multi-boot to different installed operating systems. Similarly, any operating system can be ported to multiple computer platforms. Typically operating systems have some kind of hardware abstraction layer or HAL to make this easier.
Software Platforms
So if OS X and iOS are not operating systems, despite their deliberately deceptive marketing names, then what are they? Just like there are computer platforms like PC, PowerPC, MAC, PDP, etc., there are also software platforms.
A software platform, hereafter shortened to just platform, is software that supports other software (apps, services, etc.) as an intermediary between that software and the operating system. An application that is native to the operating system directly calls the API the operating system supplies to open files, send messages over the Internet, allocate memory, etc. A platform application calls the API exposed by a platform instead of the API exposed by the OS.
So why use platforms? The first obvious reason is to make your application independent of operating systems so that it can be used by all users. Examples of platforms that you use that do exactly this are web browsers, Adobe Flash, Java, and .NET (well, .NET to some extent).
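A tiny C sketch of the layering idea, treating the C standard library as a minimal "platform" for the sake of illustration: the first half is portable because the runtime translates it to whatever the OS provides, while the second half calls the POSIX OS API directly and assumes a Unix-like system underneath.

#include <stdio.h>   /* portable C runtime */
#include <fcntl.h>   /* POSIX-specific: open */
#include <unistd.h>  /* POSIX-specific: write, close */

int main(void) {
    /* Platform-style: fopen works anywhere a C runtime exists;
       underneath it becomes write() on Unix or WriteFile on Windows. */
    FILE *f = fopen("portable.txt", "w");
    if (f) { fputs("hello\n", f); fclose(f); }

    /* OS-style: a direct POSIX call, not portable to non-Unix systems. */
    int fd = open("native.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd >= 0) { write(fd, "hello\n", 6); close(fd); }
    return 0;
}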
Some platforms are not meant to provide universal access to applications, but rather perform other services for applications for various reasons. For example, Android, OS X, and iOS are all platforms, and each performs certain functions such as garbage collection, providing a GUI framework, providing an app store, managing application permissions (access to things on your computer), etc.
Other examples of platforms, in a more general sense of the term, include web servers like Microsoft IIS or Apache Tomcat that host web applications and web services and provide various services to the hosted software including lifecycle management, load balancing, parallelism, etc.
So platforms perform very useful services that allow application development to be simplified and allow applications to run more smoothly.
It is important to note that there are two kinds of platforms, managed and unmanaged. There is a world of difference between these two kinds.
An unmanaged platform such as iOS provides an API and frameworks for supporting software, but the execution of that software is performed natively by the operating system. For example, an application written in Objective-C gets compiled into the machine language code of the architecture on which it is going to be executed just like an unmanaged application would.
A managed platform such as Java or .NET provides a virtual runtime environment or virtual machine that exposes both virtual instructions and an API that completely abstracts the underlying hardware and operating system. For example, a Java application runs inside a process that hosts a VM. When the Java application performs any operation, it does so through a Java Virtual Machine (JVM) instruction that the virtual machine translates into native CPU instructions. Although this might seem slower, and it was in the early days of Java (the 1990s), it can be done just as quickly, and in fact even quicker, using various techniques including just-in-time (JIT) compilation.
[Side note: There may be times when executing native code is necessary, and there are mechanisms for doing that in managed platforms.]
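To illustrate the core of a managed platform, here is a toy bytecode interpreter in C. Real VMs like the JVM layer bytecode verification, garbage collection, and JIT compilation on top of this basic dispatch loop:

/* A miniature virtual machine: the CPU never runs the "app"
 * directly; it runs this loop, which interprets virtual instructions. */
#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* "Bytecode" for: push 2; push 3; add; print; halt */
    int code[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    int stack[16], sp = 0, pc = 0;

    for (;;) {
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];        break;
        case OP_ADD:   sp--; stack[sp-1] += stack[sp];  break;
        case OP_PRINT: printf("%d\n", stack[sp-1]);     break; /* prints 5 */
        case OP_HALT:  return 0;
        }
    }
}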
Managed platforms offer a large number of benefits including preventing malicious or wayward applications from compromising a system (no core dumps or blue screens of death), sandboxing applications, garbage collection, portability of execution, standardized serialization, run-time optimization of code, better exception handling, etc. In my opinion, managed platforms are vastly superior to unmanaged platforms, which are so 20th century.
Android is Google's implementation of the Java managed platform and is designed for tablets and smartphones. Sun Microsystems, now bought out by Oracle, created three versions of the Java platform. Java Standard Edition was the baseline that ran on desktops and servers, and it was awesome. Java Enterprise Edition added a lot of features geared towards big corporate IT needs, and almost all of it was terrible and led to a backlash and the term Plain Old Java Object or POJO.
Java Micro Edition or Java ME was Sun's attempt to scale down Java to run on phones, which at the time had very limited resources. This turned out to be a big mistake. Over the next 15 years the smartphone would become exponentially better, faster, and more capable. The progress in smartphone technology has been astonishingly fast. Today's smartphones are basically tiny versions of what desktop computers were at the start of this century. Sun's scaled-down version of Java was essentially worthless, and it didn't have to be.
Google took the opposite approach years later. Google realized that smartphones would be fast enough, have enough memory, have enough storage, have enough resolution to do the things that desktops could do. So Google started with Java Standard Edition and built it up rather than scaling it down. Android is essentially a version of Java SE that is customized for smartphones and tablets by adding support for all the hardware features in mobile computing like GPS, orientation, cameras, telephony functionality, etc.
The fact that Android is a managed platform and iOS is not is one of the reasons I consider Android to be vastly superior to iOS.
So, OS X and iOS are unmanaged platforms that run on top of an operating system, but what operating system?
The Operating Systems of Phones
There are multiple smartphone/PDA operating systems. You may have heard of Palm OS, one of the first for PDAs (hand-held computers that were popular in the 1990s). Palm OS was great for its time. Another OS that runs on smartphones is Windows CE. It was Microsoft's attempt to make the smartphone and tablet feel like a desktop. This turned out to be a mistake. Microsoft later made the opposite mistake by trying to turn the desktop into a tablet with Metro -- cringe.
You may be surprised to hear this, but your iPhone runs the same operating system as my Android. That operating system is called Unix.
Well, it's better to think of Unix as a family of operating systems. It started as an operating system created at Bell Labs back in the late 1960s, and it still has all the baggage from back then. But the code base has been forked, that is, copied and diverged, so many times that there are many different versions of Unix, each best thought of as a different operating system with a lot of common concepts, implementations, and history. For example, IBM has a version of UNIX called AIX. Sun Microsystems had Sun OS and later Solaris. Silicon Graphics, think of all those Pixar movies, has IRIX. DEC had Ultrix.
And then one man, pissed off at how expensive UNIX was to license, created his own version named Linux. In 1991 Linus Torvalds created a royalty-free version of UNIX that immediately became widely popular. By the late 1990s, Linux had been forked so many times that it too was a family of operating systems called distributions or distros.
In the late 1990s when Apple realized it could not maintain its shitty operating system -- MAC OS was the operating system equivalent of Internet Explorer -- Apple decided to use UNIX as the operating system of all its products. This was the best decision Apple ever made, and damn they were lucky. If Linus Torvalds hadn't created Linux in 1991 and Linux hadn't become so popular, Apple would have died in the early 2000s. This is not an exaggeration. Apple simply could not maintain its crappy OS. All other operating systems supported multimedia and multithreading and a host of other things. Apple couldn't even implement multithreading in its OS, and that was a necessity in all modern computing.
By adopting BSD UNIX as its operating system of choice, Apple got everything for free. It could now do true pre-emptive multitasking with multithreading. It got much better multimedia support. And, most of all, Apple gained entry into every market including the critical smartphone market. Unix runs on smartphones. Apple's operating system, OS 9, did not and could not. For this reason alone, Apple should give half of its wealth to Linus Torvalds. He did far more for Apple than Steve Jobs ever did.
OS X, in all its versions, is simply an unmanaged platform that runs on top of Unix to provide a desktop shell and an unmanaged API, most of which is a pass-through to the UNIX API. That's not to say that these things aren't valuable, but OS X is not an operating system in any sense of the term. Its name is a marketing lie, a deliberate misnomer.
iOS, likewise, is an unmanaged platform that runs on top of UNIX on iPhones. Android is a managed platform that runs on top of UNIX on various smartphones and tablets. If you look at the file system on an iPhone and an Android phone, you'll see the same directory structures, the same core command-line utilities, and Unix's equivalent of Windows' DLLs, which are called shared objects and have the .so extension.
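For instance, on a typical Linux-based system a program can load such a shared object at run time with dlopen/dlsym, the Unix counterpart of Windows' LoadLibrary/GetProcAddress (a minimal sketch; libm.so.6 is just a library that exists on common glibc systems, and you may need to link with -ldl):

#include <stdio.h>
#include <dlfcn.h>  /* dlopen, dlsym, dlclose */

int main(void) {
    void *lib = dlopen("libm.so.6", RTLD_LAZY);   /* load the .so */
    if (!lib) { fprintf(stderr, "%s\n", dlerror()); return 1; }

    /* Look up the cos() symbol inside the shared object. */
    double (*cosine)(double) = (double (*)(double))dlsym(lib, "cos");
    if (cosine)
        printf("cos(0) = %f\n", cosine(0.0));     /* 1.000000 */

    dlclose(lib);                                 /* unload */
    return 0;
}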
[Side note: I recommend Easy File Explorer for managing your file system on Android machines. It was the very first app I installed on my tablet and on my phone. I don't know if there is an iOS equivalent.]
Most of the time when non-professionals are talking about whether they like MAC or Windows, iOS or Android, they are really just talking about the desktop shells offered on those platforms or OSes. That is as shallow as judging a car by its dashboard. Sure, when you buy a car, it's nice to have the color you like. It's also nice to have a reliable, fuel-efficient engine and a transmission that isn't going to break down when you're in the middle of nowhere. The OS is what's under the hood. The desktop shell is like the dashboard. You can replace the dashboard of a 1984 Ford Ares with something more modern and appealing, but you're still driving a 1984 Ford Ares.
[Side note: Three out of five Ford Ares drivers have fixed addresses.]
The Future
Personally, I'd like to see an entirely new, modern operating system created. I don't mean a new revision of that crappy old Unix from the 1960s. Yes, there are reasons I'm no fan of Unix, but they go into technical details that few people would be interested in. Nor do I want a better version of Windows, as it has too much baggage as well. And OS X, iOS, and Android are platforms, not operating systems. Yes, platforms are important, but so are operating systems.
Over the past 20 years, most operating system development hasn't really been OS development. It's been changing fashions of desktop shells. Let's remove the borders and chrome around windows. Let's snap windows to half or a quarter of the screen. Let's show all our apps as tiles in a list. Let's provide an app store as a single point of buying and installing apps. These aren't real operating system issues.
You might be thinking that the operating system is irrelevant now, and to some extent you would be right. Sharing documents and interacting is now all done over the Web, which is essentially a distributed platform. Most of the things you do, from sending emails and IMs to reading news to banking, are done in a browser that is largely independent of the underlying operating system.
However, I believe there is a third generation of operating systems that needs to be created. The first generation of OSes were basically bundles of I/O functions loosely placed into an API of sorts. The second generation of operating systems provided hardware management, in the form of UNIX on mainframes and Windows NT and OS/2 on desktops. They dealt with memory management, process lifecycles, graphics adapters and gaming, and eventually multithreading and networking.
The third generation of operating systems should provide management of applications, communication, and content. I'm not talking about low-level functions already provided by OSes like talking over the network. I mean high-level management of the ecosystem we now live in. A third generation OS would provide every application or service with its own environment and prevent one application from affecting another application's or the OS's environment. For example, no virus could write its code into another application's code. You could see every application on your system and there would be no place to hide, because each application would have to be registered in order to even exist on the file system, much less execute.
A third generation OS would also ensure that applications didn't send your credit card information across the network without your permission. Encryption would be provided by the OS for all network traffic. It would make sure applications didn't call home without your knowledge and consent. It would prevent applications from snooping on your private information like your contact lists and what other applications you have installed. All inter-application and inter-process communication would be done through OS channels. It would prevent key loggers and any other kind of spyware.
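Purely as a hypothetical sketch (none of these functions or structures exist in any real OS; the names are invented here), such an OS channel might gate every outbound message on the sending application's declared, user-granted permissions:

#include <stdio.h>
#include <string.h>

typedef struct {
    const char *app_id;
    int may_use_network;  /* hypothetical permission, granted by the user */
} manifest_t;

/* Hypothetical OS-side gatekeeper: no permission, no traffic. */
int os_channel_send(const manifest_t *m, const char *host, const char *data) {
    if (!m->may_use_network) {
        fprintf(stderr, "%s: network access denied by user policy\n", m->app_id);
        return -1;
    }
    /* A real third-generation OS would encrypt and route this;
       here we just log what would be sent. */
    printf("[%s -> %s] %zu bytes allowed\n", m->app_id, host, strlen(data));
    return 0;
}

int main(void) {
    manifest_t app = { "com.example.player", 0 };  /* the user said no */
    os_channel_send(&app, "telemetry.example.com", "contact list");
    return 0;
}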
A third generation OS would provide full management of your content, from photos to videos to saved games, in a way that would both simplify tasks and empower the end user. I'm not just talking about backing things up to the cloud or providing specific folders for different kinds of content. I'm talking about managing all your content: protecting it from loss or unauthorized viewing, encrypting it, copying it between machines, sharing it with other people, even version-controlling and face-tagging it, and searching it by similar content.
There are a lot of improvements to make to the operating system layer. Why is it that we only get different color pixels every few years?
#environment