Apple has not made an operating system in this entire millennium.


               
2015 May 28, 5:27pm

by Dan8267

There has for some time been much confusion about what an operating system is and what makes a good one. So I've taken the time to offer a bit of clarity.

Operating System

An operating system is the software that operates the system. It's a self-defining term. An operating system does a lot of important work that most people are completely ignorant of and don't give a damn about, except insofar as it affects whether the computer works and how fast it is. The system that the OS operates is the hardware plus the many OS-level entities that support running applications and other software on your computer. Such entities include processes, threads, file handles, network stacks, memory blocks, and so forth.

The operating system essentially does two things. The first is operating the hardware and managing its resources; this is why it's called an operating system. The second, and equally important, thing an OS does is act as an Application Programming Interface, or API, for all other software, so that other software can run and interact with the hardware as well as with each other. Windows used to have an API called Win16; that's now retired. Today Windows offers two distinct APIs, Win32 and Win64, which are the 32-bit and 64-bit versions.
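To make the API half of that concrete, here is a minimal sketch in C of a program asking the Win32 API for one of those OS-managed entities, a file handle. The file name is a placeholder; this is an illustration, not code from any particular application.

    /* Minimal sketch: asking the OS, via Win32, for a file handle. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* The kernel, not the application, owns the handle it returns. */
        HANDLE h = CreateFileA("example.txt", GENERIC_WRITE, 0, NULL,
                               CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            fprintf(stderr, "CreateFileA failed: %lu\n", GetLastError());
            return 1;
        }

        DWORD written = 0;
        WriteFile(h, "hello\r\n", 7, &written, NULL); /* the OS does the I/O */
        CloseHandle(h);                               /* return the resource */
        return 0;
    }

Every file, thread, and socket an application uses follows the same pattern: ask the OS, use the handle, give it back.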

Windows 3.1 and earlier were not operating systems, but rather graphical shells that ran on top of the operating system called DOS. OS X and iOS also are NOT operating systems but rather platforms -- I'll define that soon -- that run on top of an operating system. The last operating system Apple wrote was OS 9 in 1999, which was such a flaming piece of shit that Apple decided to stop even trying to write OSes. This was in fact one of the best decisions Apple ever made, and I mean for Apple. The other best decision Apple ever made was to stop designing its own computer architecture and simply use PCs.

PC or Mac

The term Personal Computer, or PC, originally meant any computer an individual could own. It had to be cheap and small compared to the mainframes and minicomputers. A minicomputer was the size of a refrigerator; in the 1970s, this was considered small. A PC was any microcomputer. Today we don't use the term microcomputer, but rather desktop and tower. Later came the laptop, tablet, and phone form factors.

By 1984 the term PC no longer meant any computer owned by an individual, because the microcomputer industry had largely boiled down to two camps, Intel and Apple. PC then meant any computer based on the Intel x86 line, such as the 8086, 80286, 80386, 80486, the 80586 or Pentium, etc. Intel stopped using the 80x86 names because it couldn't trademark them, a court having ruled that they were descriptions. So, in order to market against AMD, another company that made x86 CPUs, Intel trademarked "Pentium" and started calling its chips by that name, which is why it's now much harder to tell which chips precede which.

On the Apple side, Apple made computers like the Apple I and Apple ][ and eventually the Lisa, which quite frankly were of amateur hobbyist quality. These were the only true Apple computers in the fullest sense. This all changed when the executives at the Xerox corporation, the makers of photocopiers, made one of the stupidest mistakes in history. They showed Steve Jobs the work of their Palo Alto Research Center, which built directly on the inventions of the great Douglas Engelbart.

Douglas Engelbart invented the mouse and pioneered the graphical user interface; Xerox PARC extended that work and added inventions of its own, such as Ethernet. These inventions were decades before their time. Steve Jobs saw them and, as was common in the early days of computing, ripped them off wholesale. And the Apple equivalent of the graphical user interface was a piece of shit compared to the already completed versions he had been shown.

So Apple went on to build the Macintosh, which used the Motorola architecture. Motorola is a company with a long history in embedded computing (think electronic security doors, building automation, cell phones, processors in various parts of automobiles, etc.). Specifically, all true Macintoshes, i.e. those before the PowerPC, used Motorola 680x0 chips, from the 68000 to the 68040.

However, Apple wasn't able to compete well against the Windows/PC (or Microsoft/Intel) alliance using the Motorola chips. The Intel chips were just better for desktop computing. Not that the Motorola chips weren't great for embedded devices, but Intel offered faster machines that could do more.

So Apple, IBM, and Motorola got together and tried a new idea, Reduced Instruction Set Computing, or RISC. The idea was to eliminate most of the operations a CPU could do, so that the CPU's clock speed could be drastically increased. The CPU would have to execute more instructions to do what an Intel chip could do with one complex instruction, but it could hopefully do each simple instruction much faster. However, RISC on the desktop turned out to be a flop. First, the Intel chips ended up clocked faster than the PowerPC chips, completely defeating the purpose of the RISC architecture. Second, the complex instructions were quite frankly awesome and made a lot of things, like playing and editing video, possible.

So Apple abandoned the PowerPC, which despite its deceptive marketing name was not a PC, and simply moved to the PC platform. Remember, the terms PC and Windows refer to completely different things. Windows is an operating system that runs on PCs and other devices (phones), while PC is a computer architecture that can run a multitude of operating systems including Windows, OS/2, OS X, Solaris, Linux, etc.

Now, using PCs as the basis for its products was a great improvement. Intel released the MMX and later the SSE instruction sets (built around the XMM registers), which are CPU instructions for massively parallel arithmetic operations designed to support multimedia. These instructions are beyond awesome. Unless you've written multimedia code in C, ported it to assembly to double its speed, and then written SSE code to increase the speed by two to three orders of magnitude, you don't comprehend how awesome these instructions are. I do not exaggerate when I say that without these instructions, video playback, and certainly video editing, would be impossible on computers. And Apple made the transition to PCs right when multimedia was taking off. If it hadn't, the entire Apple experiment would have ended.
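To give a sense of what these instructions actually do, here is a hedged sketch in C using the SSE intrinsics: a single _mm_add_ps adds four pairs of floats at once, which is where the multimedia speedups come from. The function and variable names are mine, purely for illustration.

    /* Sketch: data-parallel addition with SSE intrinsics. */
    #include <xmmintrin.h>  /* SSE; these map onto the XMM registers */

    void add4(const float *a, const float *b, float *out)
    {
        __m128 va  = _mm_loadu_ps(a);      /* load 4 floats from a */
        __m128 vb  = _mm_loadu_ps(b);      /* load 4 floats from b */
        __m128 sum = _mm_add_ps(va, vb);   /* one instruction, 4 additions */
        _mm_storeu_ps(out, sum);           /* store 4 results */
    }

Scale that up across a video frame's worth of pixels inside a tight loop, and the speedups described above start to make sense.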

Today, the only computers that you use that are not PCs are your tablets and phones. Well, some tablets are PCs, but others like the Google Nexus 10 are not. Intel has recently entered the smartphone market, but I don't know if their chips are following the x86 architecture or not, and Intel has a lot of catching up to do. I'm sure the executives at Intel are kicking themselves, or more likely their engineers, for missing the smartphone boat ten years ago.

Hopefully at this point you have a firm grasp of what an operating system is, what a computer platform is, and what the differences between the two are. There is a many-to-many relationship between OSes and computer platforms. A single platform like the PC can run many operating systems and can even dual- or multi-boot into different installed operating systems. Similarly, an operating system can be ported to multiple computer platforms. Typically operating systems have some kind of hardware abstraction layer, or HAL, to make this easier.
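As a toy illustration of the HAL idea (a sketch under assumptions, not any real kernel's code), the portable part of an OS can touch hardware only through a table of function pointers that each port fills in differently:

    /* Toy HAL: portable kernel code calls through hal_ops; each
       hardware port supplies its own implementation of the table. */
    #include <stdio.h>
    #include <stddef.h>

    struct hal_ops {
        size_t (*console_write)(const char *buf, size_t len);
    };

    /* Stub "port" that writes to stdout; a real port would poke hardware. */
    static size_t stub_console_write(const char *buf, size_t len)
    {
        return fwrite(buf, 1, len, stdout);
    }

    static const struct hal_ops stub_hal = { stub_console_write };

    /* This function never changes when the OS is ported to a new platform. */
    static void kernel_log(const struct hal_ops *hal, const char *msg, size_t len)
    {
        hal->console_write(msg, len);
    }

    int main(void)
    {
        kernel_log(&stub_hal, "booting...\n", 11);
        return 0;
    }

Porting the OS then means rewriting the hal_ops table, not the kernel above it.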

Software Platforms

So if OS X and iOS are not operating systems, despite their deliberately deceptive marketing names, then what are they? Just as there are computer platforms like the PC, PowerPC, Mac, PDP, etc., there are also software platforms.

A software platform, hereafter shortened to just platform, is software that supports other software (apps, services, etc.) as an intermediary between that software and the operating system. An application that is native to the operating system directly calls the API the operating system supplies to open files, send messages over the Internet, allocate memory, etc. A platform application calls the API exposed by a platform instead of the API exposed by the OS.
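Here is a small C sketch of that layering, using POSIX calls as the "OS API" and C's stdio as a stand-in for a platform layered on top of it (the file names are placeholders):

    /* Native route vs. platform route for the same job: writing a file. */
    #include <fcntl.h>   /* OS API: open()           */
    #include <unistd.h>  /* OS API: write(), close() */
    #include <stdio.h>   /* "platform" API: fopen(), fprintf(), fclose() */

    int main(void)
    {
        /* Native application: call the OS API directly. */
        int fd = open("direct.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd >= 0) {
            write(fd, "via the OS API\n", 15);
            close(fd);
        }

        /* Platform application: call the layer, which calls the OS for us. */
        FILE *f = fopen("layered.txt", "w");
        if (f) {
            fprintf(f, "via a platform API\n");
            fclose(f);
        }
        return 0;
    }

Both routes end in the same OS calls; the platform just sits in between.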

So why use platforms? The first obvious reason is to make your application independent of the operating system so that it can be used by all users. Examples of platforms you use that do exactly this are web browsers, Adobe Flash, Java, and .NET (well, .NET to some extent).

Some platforms are not meant to provide universal access to applications, but rather perform other services for applications for various reasons. For example, Android, OS X, and iOS are all platforms, and each performs certain functions such as garbage collection, providing a GUI framework, providing an app store, managing application permissions (access to things on your computer), etc.

Other examples of platforms, in a more general sense of the term, include web servers like Microsoft IIS or Apache Tomcat that host web applications and web services and provide various services to the hosted software including lifecycle management, load balancing, parallelism, etc.

So platforms perform very useful services that allow application development to be simplified and allow applications to run more smoothly.

It is important to note that there are two kinds of platforms, managed and unmanaged. There is a world of difference between these two kinds.

An unmanaged platform such as iOS provides an API and frameworks for the software it supports, but the execution of that software is performed natively by the operating system. For example, an application written in Objective-C gets compiled ahead of time into the machine language of the architecture on which it will be executed, just like any native application.

A managed platform such as Java or .NET provides a virtual runtime environment, or virtual machine, that exposes both virtual instructions and an API that completely abstract the underlying hardware and operating system. For example, a Java application runs inside a process that hosts a VM. When the Java application performs any operation, it does so through a Java Virtual Machine (JVM) instruction that the virtual machine translates into native CPU instructions. Although this might seem slower, and it was in the early days of Java (the 1990s), it can be done just as quickly, and in fact even faster, using various techniques including just-in-time (JIT) compilation.
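To illustrate the virtual-machine idea, here is a toy stack-machine interpreter in C. It is not the JVM, and the opcodes are invented, but it shows the core trick: the program is data, and each "virtual instruction" is translated into native operations as it executes. A JIT goes one step further and compiles hot stretches of this bytecode directly to machine code.

    /* Toy bytecode interpreter: a stack machine with four opcodes. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    static void run(const int *code)
    {
        int stack[64], sp = 0;
        for (int pc = 0; ; ) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++];         break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]);      break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void)
    {
        /* "Bytecode" for: print(2 + 3) */
        const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }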

[Side note: There may be times when executing native code is necessary, and there are mechanisms for doing that in managed platforms.]

Managed platforms offer a large number of benefits including preventing malicious or wayward applications from compromising a system (no core dumps or blue screens of death), sandboxing applications, garbage collection, portability of execution, standardized serialization, run-time optimization of code, better exception handling, etc. In my opinion, managed platforms are vastly superior to unmanaged platforms, which are so 20th century.

Android is Google's implementation of the Java managed platform, designed for tablets and smartphones. Sun Microsystems, since bought out by Oracle, created three versions of the Java platform. Java Standard Edition was the baseline that ran on desktops and servers, and it was awesome. Java Enterprise Edition added a lot of features geared towards big corporate IT needs; almost all of it was terrible and led to a backlash and the term Plain Old Java Object, or POJO.

Java Micro Edition, or Java ME, was Sun's attempt to scale Java down to run on phones, which at the time had very limited resources. This turned out to be a big mistake. Over the next 15 years the smartphone became exponentially better, faster, and more capable. The progress in smartphone technology has been astonishingly fast. Today's smartphones are basically tiny versions of what desktop computers were at the start of this century. Sun's scaled-down version of Java was essentially worthless, and it didn't have to be.

Google took the opposite approach years later. Google realized that smartphones would be fast enough, and would have enough memory, storage, and resolution, to do the things that desktops could do. So Google started with Java Standard Edition and built it up rather than scaling it down. Android is essentially a version of Java SE customized for smartphones and tablets by adding support for all the hardware features of mobile computing, like GPS, orientation sensors, cameras, telephony, etc.

The fact that Android is a managed platform and iOS is not is one of the reasons I consider Android to be vastly superior to iOS.

So, OS X and iOS are unmanaged platforms that run on top of an operating system. But what operating system?

The Operating Systems of Phones

There are multiple smartphone/PDA operating systems. You may have heard of Palm OS, one of the first, built for PDAs (the hand-held computers that were popular in the 1990s). Palm OS was great for its time. Another OS that runs on smartphones is Windows CE, Microsoft's attempt to make the smartphone and tablet feel like a desktop. That turned out to be a mistake. Microsoft later made the opposite mistake by trying to turn the desktop into a tablet with Metro -- cringe.

You may be surprised to hear this, but your iPhone runs the same operating system as my Android. That operating system is called Unix.

Well, it's better to think of Unix as a family of operating systems. It began as a single operating system created at Bell Labs back in 1969, and it still carries the baggage from back then. But the code base has been forked, that is, copied and diverged, so many times that there are many different versions of Unix, each best thought of as a different operating system with a lot of common concepts, implementations, and history. For example, IBM has a version of UNIX called AIX. Sun Microsystems had SunOS and later Solaris. Silicon Graphics, think of all those Pixar movies, had IRIX. DEC had Ultrix.

And then one man, pissed off at how expensive UNIX was to license, wrote his own. In 1991 Linus Torvalds created Linux, a royalty-free UNIX-like system written from scratch, and it quickly became widely popular. By the late 1990s, Linux itself had been repackaged and customized so many times that it too was a family of operating systems, whose members are called distributions, or distros.

In the late 1990s, when Apple realized it could not maintain its shitty operating system -- Mac OS was the operating system equivalent of Internet Explorer -- Apple decided to use UNIX, specifically the BSD-derived core it acquired with NeXT, as the operating system of all its products. This was the best decision Apple ever made, and damn, it was lucky. If free UNIX hadn't already been built and proven -- BSD, and Linux after it -- Apple would have died in the early 21st century. This is not an exaggeration.

Apple simply could not maintain its crappy OS. All other operating systems supported multimedia and multithreading and a host of other things. Apple couldn't even implement multithreading in its OS, and that was a necessity in all modern computing.

By adopting BSD UNIX as its operating system of choice, Apple got everything for free. It could now do true pre-emptive multitasking with multithreading. It got much better multimedia support. And, most of all, Apple gained entry into every market, including the critical smartphone market. Unix runs on smartphones. Apple's own operating system, OS 9, did not and could not. For this reason alone, Apple should give half of its wealth to the UNIX developers whose work it adopted. They did far more for Apple than Steve Jobs ever did.

OS X, in all its versions, is simply an unmanaged platform that runs on top of Unix to provide a desktop shell and an unmanaged API, most of which is a pass-through to the UNIX API. That's not to say that these things aren't valuable, but OS X is not an operating system in any sense of the term. Its name is a marketing lie, a deliberate misnomer.

iOS, likewise, is an unmanaged platform that runs on top of UNIX on iPhones. Android is a managed platform that runs on top of UNIX on various smartphones and tablets. If you look at the file systems on an iPhone and an Android phone, you'll see the same directory structures, the same command-line utilities, and Unix's equivalent of Windows' DLLs, which are called shared objects and have the .so extension.
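For the curious, here is a short C sketch of how software on a Unix system loads one of those .so files at run time. The library name libm.so.6 is the usual glibc math library on Linux; the exact name varies by system, so treat it as an assumption. On Linux this links with -ldl.

    /* Loading a shared object (.so) at run time with the Unix dl API. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *lib = dlopen("libm.so.6", RTLD_NOW);  /* map the .so */
        if (!lib) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }

        /* Look up a symbol inside it and call through the pointer. */
        double (*my_cos)(double) = (double (*)(double))dlsym(lib, "cos");
        if (my_cos)
            printf("cos(0) = %f\n", my_cos(0.0));

        dlclose(lib);  /* unmap when done */
        return 0;
    }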

[Side note: I recommend Easy File Explorer for managing your file system on Android machines. It was the very first app I installed on my tablet and on my phone. I don't know if there is an iOS equivalent.]

Most of the time when non-professionals are talking about whether they like Mac or Windows, iOS or Android, they are really just talking about the desktop shells offered on those platforms or OSes. That's as shallow as judging a car by its color. Sure, when you buy a car, it's nice to have the color you like. It's also nice to have a reliable, fuel-efficient engine and a transmission that isn't going to break down when you're in the middle of nowhere. The OS is what's under the hood. The desktop shell is like the dashboard. You can replace the dashboard of a 1984 Dodge Aries with something more modern and appealing, but you're still driving a 1984 Dodge Aries.

[Side note: Three out of five Dodge Aries drivers have fixed addresses.]

The Future

Personally, I'd like to see an entirely new, modern operating system created. I don't mean a new revision of that crappy old Unix from the 1960s. Yes, there are reasons I'm no fan of Unix, but they go into technical details that few people would be interested in. Nor do I want a better version of Windows, as it has too much baggage as well. And OS X, iOS, and Android are platforms, not operating systems. Yes, platforms are important, but so are operating systems.

Over the past 20 years, most operating system development hasn't really been OS development. It's been changing fashions of desktop shells. Let's remove the borders and chrome around windows. Let's snap windows to half or a quarter of the screen. Let's show all our apps as tiles in a list. Let's provide an app store as a single point of buying and installing apps. These aren't real operating system issues.

You might be thinking that the operating system is irrelevant now, and to some extent you would be right. Sharing documents and interacting is now all done over the Web, which is essentially a distributed platform. Most of the things you do, from sending emails and IMs to reading news to banking, are done in a browser that is largely independent of the underlying operating system.

However, I believe there is a third generation of operating systems that needs to be created. The first generation of OSes were basically bundles of I/O functions loosely collected into an API of sorts. The second generation provided real hardware management, in the form of UNIX on minicomputers and mainframes and Windows NT and OS/2 on desktops. They dealt with memory management, process lifecycles, graphics adapters and gaming, and eventually multithreading and networking.

The third generation of operating systems should provide management of applications, communication, and content. I'm not talking about low-level functions already provided by OSes, like talking over the network. I mean high-level management of the ecosystem we now live in. A third-generation OS would provide every application or service with its own environment and prevent one application from affecting another application's or the OS's environment. For example, no virus could write its code into another application's code. You could see every application on your system, and there would be no place to hide, because each application would have to be registered in order to even exist on the file system, let alone execute.
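Purely as a speculative sketch of that registration idea (every name, structure, and hash below is invented for illustration), the loader in such an OS might refuse to execute any binary whose content hash is not present in an OS-maintained registry:

    /* Speculative sketch: a loader check against an application registry. */
    #include <stdbool.h>
    #include <string.h>
    #include <stdio.h>

    struct app_record {
        const char *hash;  /* content hash of the registered binary */
        const char *name;
    };

    /* In a real design this would live in protected OS storage. */
    static const struct app_record registry[] = {
        { "9f2c", "TrustedEditor" },  /* toy hashes, truncated */
        { "41d8", "PhotoManager"  },
    };

    static bool is_registered(const char *hash)
    {
        for (size_t i = 0; i < sizeof registry / sizeof registry[0]; i++)
            if (strcmp(registry[i].hash, hash) == 0)
                return true;
        return false;
    }

    int main(void)
    {
        /* The loader would compute this hash from the binary on disk. */
        const char *candidate = "9f2c";
        puts(is_registered(candidate) ? "run" : "refuse: unregistered app");
        return 0;
    }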

A third generation OS would also ensure that applications didn't send your credit card information across the network without your permission. Encryption would be provided by the OS for all network traffic. It would make sure applications didn't call home without your knowledge and consent. It would prevent applications from snooping on your private information like your contact lists and what other applications you have installed. All inter-application and inter-process communication would be done through OS channels. It would prevent key loggers and any other kind of spyware.

A third-generation OS would provide full management of your content, from photos to videos to saved games, in a way that would both simplify tasks and empower the end user. I'm not just talking about backing things up to the cloud or providing specific folders for different kinds of content. I'm talking about managing all your content: protecting it from loss or unauthorized viewing, encrypting it, copying it between machines, sharing it with other people, even version-controlling it, face-tagging it, and searching it by similar content.

There are a lot of improvements to make to the operating system layer. Why is it that we only get different color pixels every few years?

#environment


41   curious2   2015 May 30, 3:39am  

Dan8267 says

Apple got everything for free.

IOW, "good artists copy, great artists steal." If you want to become the most valuable corporation in the world, then instead of trying to build a new OS from scratch, take a free one and make it your own. From a customer POV, Apple has the best OS. I say this as a person who prefers more features, even though they introduce more complexity: Apple curates only the most valuable features and qualities, because Apple has taste. I like my non-Apple products, but Apple customers love Apple products, and that makes a huge difference in quality of life.

42   justme   2015 May 30, 9:01am  

That word "curate" seems to be trending up in all kinds of contexts recently. Is it now a weasel-word for theft, copying and plagiarism?

43   Strategist   2015 May 30, 9:23am  

curious2 says

Dan8267 says

Apple got everything for free.

IOW, "good artists copy, great artists steal." If you want to become the most valuable corporation in the world, then instead of trying to build a new OS from scratch, take a free one and make it your own. From a customer POV, Apple has the best OS. I say this as a person who prefers more features, even though they introduce more complexity: Apple curates only the most valuable features and qualities, because Apple has taste. I like my non-Apple products, but Apple customers love Apple products, and that makes a huge difference in quality of life.

There used to be heated debates about which was better -- iPhone or Blackberry. I don't hear that anymore, as Blackberry lies in ruins. The same is happening with Samsung now.
It's the consumer that gets to pick the winners, no one else.

44   Dan8267   2015 May 30, 9:34am  

justme says

You make it sound like Apple switched to x86 BEFORE Intel came out with MMX.

I did not state or imply any such thing. I have no idea where you got that impression.

justme says

And there is no such thing as "XXM"

Back in the 1990s it was referred to as XXM just like the registers to distinguish from the original MMX set. Perhaps a misnomer that's been long corrected. It's been over 15 years since I did assembly language programming and I haven't kept up with the terms. The literature at the time did use the term XXM.

justme says

The first several generations of PowerPC in the 1980s-1990s were far ahead of Intel, performance-wise and at times also clock-wise.

PowerPCs in the 1980s? The first PowerPCs were introduced to the public in late 1993. As for clock speeds, both sides leapfrogged each other for a while, but the PowerPC topped out at about 3 GHz and x86 topped out at about 4 GHz -- well, recently it reached 5 GHz, but the point is that the great CPU clock speed increases are a thing of the past. So this invalidates the RISC approach. Instead, if you want a faster computer, you increase the CPU count and the core count and do parallel processing.

Why Apple Really Ditched PowerPC

But then things slowly started turning dark. At MacWorld '03, instead of delivering his promised 3 GHz systems, Steve Jobs's announcement topped out at 2.5 GHz, and the G5 was getting nowhere near the PowerBook line, running too hot and drawing far too much power. The G5 was failing to meet its expectations.

At MacWorld '05, Apple once again upgraded its Power Mac line, this time pushing the chip just 200 MHz faster than it had been a year ago, the elusive 3 GHz still nowhere in sight two years after it was promised. The G5 was still nowhere near getting into the PowerBook, and vague rumors of a new PowerPC core for portables cropped up. The G4 had practically stalled out at this point and the G5 was looking more like a stop-gap and less like a real solution to Apple's chip problems.

Apple had stuck to offering dual processors to keep competitive with Intel and AMD; now, as this tactic was running out of steam, IBM released dual-core versions of the G5 in order to offer Apple the benefits of symmetric multi-processing without the engineering overhead for a second chip. The result was the PowerPC 970MP, which per core ran slower than previous versions but, as a consolation, offered twice the cache. At this point, it became evident that IBM had seriously failed to meet its goals for the G5.

Apple was now in the same position it had been in five years earlier, with a processor that was falling behind and needing a new chip supplier desperately. Since moving away from the G4, Motorola had spun its chip division off into Freescale, with a G4 whose clock hadn't budged in years and shadowy promises of 64-bit dual-core chips that weren't even ready for sampling. Apple's portable line had been hit the hardest, standing still with the G4 while Pentium derivatives cleaned house. Apple was in dire straits.

The PowerPC was a worthy experiment to run, but Intel's x86 architecture won out. And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.

45   Dan8267   2015 May 30, 9:38am  

curious2 says

"good artists copy, great artists steal."

If you look at the history of IT from 1980 to today, it's all about copying other people's ideas. All the big players did it. And it's still going on. Remember when Bing used Google search results? It's a shame that a lot of small to mid-size companies got fucked over in the process though.

Today the business model seems to be, start a small business and hope to get bought out by a large one.

46   Dan8267   2015 May 30, 9:42am  

Strategist says

It's the consumer that gets to pick the winners, no one else.

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

47   Strategist   2015 May 30, 9:54am  

Dan8267 says

Strategist says

It's the consumer that gets to pick the winners, no one else.

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

The consumer was not allowed to pick winners at that time. Microsoft was, and to a certain extent still is, a monopoly.

48   Dan8267   2015 May 30, 10:11am  

Strategist says

The consumer was not allowed to pick winners at that time. Microsoft was, and to a certain extent still is a Monopoly.

Honey, you either believe in the free market or you don't. The consumer never has unfettered choice.

And Microsoft is not Intel. Microsoft doesn't make x86 CPUs.

49   hanera   2015 May 30, 10:30am  

Dan8267 says

The PowerPC was a worthy experiment to run, but Intel's x86 architecture won out. And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.

I see you just refuse to change position. As pointed out by justme, Intel won because of wafer technology, NOT because CISC is better than RISC. RISC is now used in all iOS devices. If I were Intel, I would be worried: what if Apple uses ARM RISC in the Mac!!! Wonder which processor architecture, RISC or CISC, is most popular for Android?

50   hanera   2015 May 30, 10:32am  

Dan8267 says

Strategist says

It's the consumer that gets to pick the winners, no one else.

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

Living in the past again.

51   hanera   2015 May 30, 10:42am  

Future OS concepts -- you talk like BG. Everything critical belongs to the OS.

52   Bellingham Bill   2015 May 30, 11:13am  

hanera says

what if Apple uses ARM RISC in Mac

As long as Intel is a node or two ahead of everyone, this isn't going to happen. Plus this is a Windows world still, and being compatible here has strategic advantages, e.g. the orders Apple is getting from IBM for laptops.

Dan8267 says

And with clock speed no longer being the determining factor in computer speed, the use of reduced instruction sets simply does not make sense.

yeah, well, there's no magic that made x86 ISA better. Starting from scratch nobody in their right mind would end up with that.

(Just like all intel efforts I must add, from USB HID to PC DSDTs)

x86 has the singular advantage of being the legacy ISA that is compatible with 30+ years of industry development (only relatively recently did Linux ditch compatibility with 386), and of course the binary ISA of the platform "OS" that 90%+ of the planet uses

Apple didn't need x86 compatibility last decade when working on the iPhone, so they didn't go to Intel with an order for x86 parts.

You know what ARM stands for?

53   Bellingham Bill   2015 May 30, 11:24am  

The funny thing though is that Intel solved its x86 problem by just using it as an on-disk IL/bytecode target that got reworked into a proper RISC-ish ISA before it hit the execution units.

This is the best way to go, take a sane CISC like 68k ColdFire ISA and use that as your bytecode, and let each CPU optimize it, since compilers sure as hell can't optimize for all the different variations of CPUs the binary they're creating might encounter eventually.

So the whole PPC thing was largely unnecessary, Apple could have stuck with 68K right until now, if AIM could have kept up with INTC at the fab level.

Unfortunately, Apple's success with iOS came too late to save the Mac from Intel's embrace, though it must be said being compatible with the rest of the world has its benefits, once x86 outgrew so many of its horrible flaws.

54   Bellingham Bill   2015 May 30, 1:19pm  

Another funny thing is that I'm not aware of any x86 arcade games of the 80s or 90s, at least not until 3Dfx and DirectX were established on the PC platform, which gave arcade makers time-to-market advantage and hw cost savings going that way.

6809 and 68000 were standard. The one arcade game I had the privilege of semi-contributing to in the 90s used a 68000 for the logic and graphics display and a TMS34010 for the 2D graphics overlay.

[Linked chart: Intel's penetration of the CPU design of arcade games.]

yeah, my fiery hatred of x86 has only been slightly moderated by having to use the damn things, 2005 - now.

Luckily I work so far up the stack the ISA is completely irrelevant. I actually haven't even had to look at assembly for over 20 years.

55   MisdemeanorRebel   2015 May 30, 2:07pm  

Some seem to forget that the Linux-based Android is, far and away, the dominant mobile OS by market share, especially beyond the USA.

It wasn’t long ago that Android overtook iOS as the most popular mobile platform in the United States, but as the release of Apple’s first large screen phone approaches, Android’s lead is beginning to look insurmountable. According to the latest data from Kantar Worldpanel ComTech, Android now holds 61.9% of the U.S. market share to Apple’s 32.5%, the lowest percentage iOS has captured since the iPhone 4s launched in 2011.

Internationally, Android is even more ubiquitous, with 82.7% market share in China and 73.3% across Europe in countries including the UK, France, Germany, Spain and Italy. That's not to say Apple isn't showing signs of life -- ComTech also notes that the iPhone 5s and iPhone 5c were still the top selling phones in Britain during the month of May, even following a hugely successful launch for the Galaxy S5. On the other hand, 17% of customers who purchased a Galaxy S5 were switching from iOS to Android.


http://bgr.com/2014/07/01/android-market-share-2014/

States should pass a law requiring state universities to make a fiscal case for staying in the expensive Apple ecosystem. R and RStudio don't run on Linux? 4D is a good, modern database? What is the cost difference between a Linux build and a proprietary Mac desktop/laptop with similar memory, CPU, graphics, etc. (ho, ho, ho)? The university servers don't already run on Unix?

56   bob2356   2015 May 30, 6:02pm  

Dan8267 says

Strategist says

It's the consumer that gets to pick the winners, no one else.

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to x86 affected a whopping 3-4% of the market, tops. Not exactly a consumer market stampede. Consumers picked the Windows market and didn't give a shit in any way, shape, or form what the chip was. If IBM had picked the 68000, which it might have, then the entire PC market would be Motorola right now and almost no one would know the difference. The 68000 peripheral chipset wasn't going to be ready in time for the original PC release date by at least 6 months, so it just wasn't a consideration. IBM might have gone with the 8088 no matter what -- they used the 8085 in the Datamaster and were familiar with it -- but if there had been a fully functional 68000 to compare it to, it would have been a tough call. On the other hand, I've read several articles that said the 8-bit 8088 was chosen because a 16-bit chip, the 8086 or 68000, would have generated serious political opposition from the minicomputer division.

Why are you so fixated on apple?

57   HydroCabron   2015 May 30, 7:08pm  

Is the FED behind this?

1995 called. They want their heated online platform discussion back.

58   Bellingham Bill   2015 May 30, 9:37pm  

Jesus, I *am* hitting my 20th anniversary of arguing on the internet just about now.

: (

59   Dan8267   2015 May 31, 10:02pm  

hanera says

Intel won because of wafer technology, NOT because CISC is better than RISC.

Given that the industry has hit a wall in clock speeds, what makes you think that a reduced instruction set is better than a rich one?

bob2356 says

Why are you so fixated on apple?

I'm not. I covered quite a bit. But when you only have two brands left, Coke and Pepsi, it's hard not to mention one.

bob2356 says

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to x86 affected a whopping 3-4% of the market, tops.

Doesn't that prove my point that the consumer base chose the x86 line over several alternatives?

60   hanera   2015 Jun 1, 12:33am  

Dan8267 says

hanera says

Intel won because of wafer technology, NOT because CISC is better than RISC.

Given that the industry has hit a wall in clock speeds, what makes you think that a reduced instruction set is better than a rich one?

Hey, I didn't say that :wink:.

61   bob2356   2015 Jun 1, 6:46am  

Dan8267 says

bob2356 says

And they picked the x86 PC over both the Motorola Mac and the PowerPC.

What, are you kidding me? Apple was never more than a niche player in the computer market. Apple switching to x86 affected a whopping 3-4% of the market, tops.

Doesn't that prove my point that the consumer base chose the x86 line over several alternatives?

Not at all; consumers chose MS/DOS. If IBM had gone with the 68000 and Apple with the 8088, the PC world would have been all Motorola. The vast majority of PCs were sold to businesses, especially in the first 5 years. Business managers wanted the magic phrase IBM, which also meant MS/DOS. Once MS/DOS was established in the business world, and among business people wanting a machine for home, Apple was relegated to a couple of niche markets. No one ever cared what chip was inside other than you and Intel.

I, and lots of other people, never bought Jobs's official reasons for switching chips. AMD had faster chips than Intel. Gamers loved AMD. I think Intel made Jobs a killer deal on price to knock IBM/Motorola out. Consumers had zero to do with it.

