
Apple has not made an operating system in this entire millennium.


               
2015 May 28, 5:27pm

by Dan8267

There has for some time been much confusion about what an operating system is and what makes a good one. So I've taken the time to offer a bit of clarity.

Operating System

An operating system is the software that operates the system. It's a self-defining term. An operating system does a lot of important stuff that most people are completely ignorant of and don't give a damn about except that those things affect whether or not the computer works and how fast it is. The system that the OS operates is the hardware and many OS entities that support running applications and other software on your computer. Such entities include processes, threads, file handles, network stacks, memory blocks, and so forth.

The operating system essentially does two things. The first is operating the hardware and managing resources, which is why it's called an operating system. The second, and equally important, thing an OS does is act as an Application Programming Interface or API for all other software so that the other software can run and interact with the hardware as well as with each other. In Windows, there used to be an API called Win16. That's now retired. Today Windows offers two distinct APIs called Win32 and Win64, which are 32-bit and 64-bit versions.
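For example, a native Windows program talks to the operating system by calling Win32 functions directly. Here is a minimal sketch in C of a program asking the OS to create one of the entities mentioned above, a thread, and then waiting for it to finish. The function name and the message it prints are just illustrative.

    #include <windows.h>
    #include <stdio.h>

    /* The OS schedules this function on the new thread it creates for us. */
    static DWORD WINAPI worker(LPVOID arg)
    {
        printf("hello from a thread the OS created: %s\n", (const char *)arg);
        return 0;
    }

    int main(void)
    {
        /* CreateThread is a Win32 call: the application asks the operating
           system for a new thread, one of the OS-managed entities above. */
        HANDLE h = CreateThread(NULL, 0, worker, "worker #1", 0, NULL);
        if (h == NULL) {
            printf("CreateThread failed: %lu\n", GetLastError());
            return 1;
        }
        WaitForSingleObject(h, INFINITE);   /* block until the OS says it finished */
        CloseHandle(h);                     /* give the handle back to the OS */
        return 0;
    }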

Windows 3.1 and earlier were not operating systems, but rather graphical shells that ran on top of the operating system called DOS. OS X and iOS also are NOT operating systems but rather platforms -- I'll define that soon -- that run on top of the operating system. The last operating system that Apple wrote was OS 9 in 1999, which was such a flaming piece of shit that Apple decided to stop even trying to write OSes. This was in fact one of the best decisions Apple ever made, and I mean for Apple. The other best decision Apple ever made was to stop making hardware and simply use PCs.

PC or MAC

The term Personal Computer or PC originally meant any computer an individual could own. It had to be cheap and small compared to the mainframes and minicomputers. A minicomputer was barely the size of a refrigerator. In the 1970s, this was considered small. A PC was any microcomputer. Today we don't use the term microcomputer, but rather desktop and tower. Later came the laptop, tablet, and phone form factors.

By 1984 the term PC did not mean any computer owned by an individual because the entire microcomputer industry was boiled down to two players, Intel and Apple. PC then meant any computer based on the Intel x86 line such as the 8086, 80286, 80386, 80486, 80586 or Pentium I, etc. Intel stopped using the 80x86 names because it couldn't trademark them due to a court ruling that they were descriptions. So, in order to market against AMD, another company that made x86 CPUs, Intel trademarked "Pentium" and started calling their chips by that name, which is why today it's far more confusing which chips precede which.

On the Apple side, Apple made computers like the Apple I and Apple ][ and eventually the Lisa, which quite frankly were amateur hobbyist quality. These were the only true Apple computers in the fullest sense. This all changed when the executives at Xerox Corporation, the makers of photocopiers, made one of the stupidest mistakes in history. They showed Steve Jobs the work of the great Douglas Engelbart.

Douglas Engelbart invented the mouse, the graphical user interface, and Ethernet. His inventions were decades before their time. Steve Jobs saw these inventions and, as was common in the early days of computing, ripped them off wholesale. And the Apple equivalent of the graphical user interface was a piece of shit compared to Engelbart's already completed version.

So Apple went on to build the Macintosh, which used the Motorola architecture. Motorola is a company with a long history of embedded computing (think electronic security doors, building automation, cell phones, processors in various parts of automobiles, etc.). Specifically all true Macintoshes, i.e. before the PowerPC, used Motorola 680x0 chips from 68000 to 68040.

However, Apple wasn't able to compete well against the Windows/PC (or Microsoft/Intel) alliance using the Motorola chips. The Intel chips were just better for desktop computing. Not that the Motorola chips weren't great for embedded devices, but Intel offered faster machines that could do more.

So Apple and IBM got together and tried a new idea, Reduced Instruction Set Computing or RISC. The idea was to eliminate most operations a CPU could do, so that the CPU's clock speed could be drastically increased. The CPU would have to execute more instructions to do what an Intel chip could do with one complex instruction, but the CPU hopefully could do each simple instruction much faster. However, RISC turned out to be a flop. First, the Intel chips were always clocked faster than the PowerPC chips, completely defeating the purpose of the entire RISC architecture. Second, the complex instructions were just quite frankly awesome and made a lot of things like playing and editing video possible.

So Apple abandoned the PowerPC, which despite its deceptive marketing name was not a PC, and simply moved to the PC platform. Remember, the term PC and the term Windows are completely different. Windows is an operating system that runs on PCs and other devices (phones), and PC is a computer architecture that can run a multitude of operating systems including Windows, OS/2, OS X, Solaris, Linux, etc.

Now using PCs as the basis for their products was a great improvement. Intel released the MMX and later XXM instruction set, which are CPU instructions that deal with massively parallel arithmetic operations designed to support multimedia. These instructions are beyond awesome. Unless you've written multimedia code in C, ported it to assembly to double its speed, and then written XXM code to increase the speed by two to three orders of magnitude, you don't comprehend how awesome these instructions are. I do not exaggerate when I say that without these instructions, video playback and certainly video editing would be impossible on computers. And Apple made the transition to PCs right when multimedia was taking off. If it hadn't, the entire Apple experience would have ended.
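To give a feel for what these instructions do, here's a minimal sketch in C using the SSE intrinsics, the compiler-provided wrappers around these SIMD instructions. The arrays and their contents are just illustrative; real multimedia code plays the same trick on pixels and audio samples.

    #include <xmmintrin.h>  /* SSE intrinsics */
    #include <stdio.h>

    /* Add two float arrays four elements at a time. Each _mm_add_ps call
       performs four additions in parallel, which is the whole point of SIMD. */
    static void add_arrays(const float *a, const float *b, float *out, int n)
    {
        int i;
        for (i = 0; i + 4 <= n; i += 4) {
            __m128 va = _mm_loadu_ps(&a[i]);
            __m128 vb = _mm_loadu_ps(&b[i]);
            _mm_storeu_ps(&out[i], _mm_add_ps(va, vb));
        }
        for (; i < n; i++)              /* leftover elements, one at a time */
            out[i] = a[i] + b[i];
    }

    int main(void)
    {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float c[8];
        int i;
        add_arrays(a, b, c, 8);
        for (i = 0; i < 8; i++)
            printf("%.0f ", c[i]);      /* prints: 9 9 9 9 9 9 9 9 */
        printf("\n");
        return 0;
    }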

Today, the only computers that you use that are not PCs are your tablets and phones. Well, some tablets are PCs, but others like the Google Nexus 10 are not. Intel has recently entered the smartphone market, but I don't know if their chips are following the x86 architecture or not, and Intel has a lot of catching up to do. I'm sure the executives at Intel are kicking themselves, or more likely their engineers, for missing the smartphone boat ten years ago.

Hopefully at this point you have a firm grasp of what an operating system is and what a computer platform is, and what the differences between the two are. There is a many to many relationship between OSes and computer platforms. A single platform like the PC can run many operating systems and can even have dual or multi-boot to different installed operating systems. Similarly, any operating system can be ported to multiple computer platforms. Typically operating systems have some kind of hardware abstraction layer or HAL to make this easier.

Software Platforms

So if OS X and iOS are not operating systems, despite their deliberately deceptive marketing names, then what are they? Just like there are computer platforms like PC, PowerPC, MAC, PDP, etc., there are also software platforms.

A software platform, hereafter shortened to just platform, is software that supports other software (apps, services, etc.) as an intermediary between that software and the operating system. An application that is native to the operating system directly calls the API the operating system supplies to open files, send messages over the Internet, allocate memory, etc. A platform application calls the API exposed by a platform instead of the API exposed by the OS.

So why use platforms? The first obvious reason is to make your application independent of operating systems so that it can be used by all users. Examples of platforms that you use that do exactly this are web browsers, Adobe Flash, Java, and .NET (well, .NET to some extent).
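To make that concrete, here is a minimal sketch in C of what OS-native code looks like without a platform in between: even something as simple as opening a file has to be written once per OS API. The file name and the helper function are just examples; an application running on a platform such as Java, .NET, or a browser never sees this split, because the platform hides it behind one API.

    #include <stdio.h>

    #ifdef _WIN32
    #include <windows.h>

    /* Native Windows: go through the Win32 API. */
    static int try_open(const char *path)
    {
        HANDLE h = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE)
            return -1;
        CloseHandle(h);
        return 0;
    }
    #else
    #include <fcntl.h>
    #include <unistd.h>

    /* Native Unix: go through the POSIX API. */
    static int try_open(const char *path)
    {
        int fd = open(path, O_RDONLY);
        if (fd < 0)
            return -1;
        close(fd);
        return 0;
    }
    #endif

    int main(void)
    {
        if (try_open("example.txt") == 0)
            printf("opened example.txt\n");
        else
            printf("could not open example.txt\n");
        return 0;
    }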

Some platforms are not meant to provide universal access to applications, but rather perform other services for applications for various reasons. For example, Android, OS X, and iOS are all platforms, and each performs certain functions such as garbage collection, providing a GUI framework, providing an app store, managing application permissions (access to things on your computer), etc.

Other examples of platforms, in a more general sense of the term, include web servers like Microsoft IIS or Apache Tomcat that host web applications and web services and provide various services to the hosted software including lifecycle management, load balancing, parallelism, etc.

So platforms perform very useful services that allow application development to be simplified and allow applications to run more smoothly.

It is important to note that there are two kinds of platforms, managed and unmanaged. There is a world of difference between these two kinds.

An unmanaged platform such as iOS provides an API and frameworks for supporting software, but the execution of that software is performed natively by the operating system. For example, an application written in Objective-C gets compiled into the machine language code of the architecture on which it is going to be executed just like an unmanaged application would.

A managed platform such as Java or .NET provides a virtual runtime environment or virtual machine that exposes both virtual instructions and an API that completely abstracts the underlying hardware and operating system. For example, a Java application runs inside a process that hosts a VM. When the Java application performs any operation, it does so through a Java Virtual Machine (JVM) instruction that the virtual machine translates into native CPU instructions. Although this might seem slower, and it was in the early days of Java (the 1990s), it can be done just as quickly, and in fact even quicker, using various techniques including Just-In-Time compilation.

[Side note: There may be times when executing native code is necessary, and there are mechanisms for doing that in managed platforms.]

Managed platforms offer a large number of benefits including preventing malicious or wayward applications from compromising a system (no core dumps or blue screens of death), sandboxing applications, garbage collection, portability of execution, standardized serialization, run-time optimization of code, better exception handling, etc. In my opinion, managed platforms are vastly superior to unmanaged platforms, which are so 20th century.

Android is Google's implementation of the Java managed platform and is designed for tablets and smartphones. Sun Microsystems, since bought out by Oracle, created three versions of the Java platform. Java Standard Edition was the baseline that ran on desktops and servers, and it was awesome. Java Enterprise Edition added a lot of features geared towards big corporate IT needs, and almost all of it was terrible and led to a backlash and the term Plain Old Java Object or POJO.

Java Micro Edition or JME was Sun's attempt to scale down Java to run on phones, which at the time had very limited resources. This turned out to be a big mistake. Over the next 15 years the smartphone would become exponentially better, faster, and more capable. The progress in smartphone technology has been astonishingly fast. Today's smartphones are basically tiny versions of what desktop computers were at the start of this century. Sun's scaled-down version of Java was essentially worthless, and it didn't have to be.

Google took the opposite approach years later. Google realized that smartphones would be fast enough, have enough memory, have enough storage, have enough resolution to do the things that desktops could do. So Google started with Java Standard Edition and built it up rather than scaling it down. Android is essentially a version of Java SE that is customized for smartphones and tablets by adding support for all the hardware features in mobile computing like GPS, orientation, cameras, telephony functionality, etc.

The fact that Android is a managed platform and iOS is not is one of the reasons I consider Android to be vastly superior to iOS.

So, OS X and iOS are unmanaged platforms that run on top of an operating system, but what operating system?

The Operating Systems of Phones

There are multiple smartphone/PDA operating systems. You may have heard of Palm OS, one of the first for PDAs (hand-held computers that were popular in the 1990s). Palm OS was great for its time. Another OS that runs on smartphones is Windows CE. It was Microsoft's attempt to make the smartphone and tablet feel like a desktop. This turned out to be a mistake. Microsoft later made the opposite mistake by trying to turn the desktop into a tablet with Metro -- cringe.

You may be surprised to hear this, but your iPhone runs the same operating system as my Android. That operating system is called Unix.

Well, it's better to think of Unix as a family of operating systems. It used to be an operating system that was created by Bell Labs back in the 1960s, and it still has all the baggage from back then. But the code base has been forked, that is copied and diverged, so many times that there are many different versions of Unix, each best thought of as a different operating system with a lot of common concepts, implementations, and history. For example, IBM has a version of UNIX called AIX. Sun Microsystems had Sun OS and later Solaris. Silicon Graphics, think of all those Pixar movies, has Irix. DEC had Ultrix.

And then one man, pissed off at how expensive UNIX was to license, created his own version named Linux. In 1991 Linus Torvalds created a royalty free version of UNIX that immediately became widely popular. By the late 1990s, Linux itself had been forked so many times that it itself was a family of operating systems called distributions or distros.

In the late 1990s when Apple realized it could not maintain its shitty operating system -- MAC OS was the operating system equivalent of Internet Explorer -- Apple decided to use Linux UNIX as the operating systems of all its products. This was the best decision Apple ever made, and damn they were lucky. If Linus Torvalds hadn't created Linux in 1991 and Linux hadn't become so popular, Apple would have died in the early 2000s. This is not an exaggeration.

Apple simply could not maintain its crappy OS. All other operating systems supported multimedia and multithreading and a host of other things. Apple couldn't even implement multithreading into its OS, and that was a necessity in all modern computing.

By adopting Linux, or if you prefer, BSD UNIX as its operating system of choice, Apple got everything for free. It could now do true pre-emptive multitasking with multithreading. It got much better multimedia support. And, most of all, Apple gained entry into every market including the critical smartphone market. Unix runs on smartphones. Apple's operating system, OS 9, did not and could not. For this reason alone, Apple should give half of its wealth to Linus Torvalds. He did far more for Apple than Steve Jobs ever did.

OS X, in all its versions, is simply an unmanaged platform that runs on top of Unix to provide a desktop shell and an unmanaged API, most of which is a pass-through to the UNIX API. That's not to say that these things aren't valuable, but OS X is not an operating system in any sense of the term. Its name is a marketing lie, a deliberate misnomer.

iOS, likewise, is an unmanaged platform that runs on top of UNIX on iPhones. Android is a managed platform that runs on top of UNIX on various smartphones and tablets. If you look at the file system on an iPhone and an Android phone, you'll see the same directory structures, the same operating system command-line utilities, and Unix's equivalent of Windows' DLLs, which are called shared objects and have the .so extension.
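For the curious, here is a minimal sketch in C of how a program loads one of those shared objects at run time on a Linux-style system. The library name and the symbol looked up are just examples, and on Windows the equivalent calls are LoadLibrary and GetProcAddress. On Linux this compiles with something like gcc example.c -ldl.

    #include <dlfcn.h>   /* dlopen, dlsym, dlclose */
    #include <stdio.h>

    int main(void)
    {
        /* Load the math library (.so), the Unix counterpart of a Windows DLL. */
        void *lib = dlopen("libm.so.6", RTLD_LAZY);
        if (!lib) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        /* Look up the cos() function by name inside the shared object. */
        double (*cosine)(double) = (double (*)(double))dlsym(lib, "cos");
        if (!cosine) {
            fprintf(stderr, "dlsym failed: %s\n", dlerror());
            dlclose(lib);
            return 1;
        }

        printf("cos(0.0) = %f\n", cosine(0.0));   /* prints 1.000000 */
        dlclose(lib);
        return 0;
    }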

[Side note: I recommend Easy File Explorer for managing your file system on Android machines. It was the very first app I installed on my tablet and on my phone. I don't know if there is an iOS equivalent.]

Most of the time when non-professionals are talking about whether they like MAC or Windows, iOS or Android, they are really just talking about the desktop shells offered on those platforms or OSes. This is a very shallow way of judging, like judging a car by its color. Sure, when you buy a car, it's nice to have the color you like. It's also nice to have a reliable, fuel efficient engine and a transmission that isn't going to break down when you're in the middle of nowhere. The OS is what's under the hood. The desktop shell is like the dashboard. You can replace the dashboard of a 1984 Ford Ares with something more modern and appealing, but you're still driving a 1984 Ford Ares.

[Side note: Three out of five Ford Ares drivers have fixed addresses.]

The Future

Personally, I'd like to see an entirely new, modern operating system created. I don't mean a new revision of that crappy old Unix from the 1960s. Yes, there are reasons I'm no fan of Unix, but they go into technical details that few people would be interested in. Nor do I want a better version of Windows as it has too much baggage as well. And OS X, iOS, and Android are platforms, not operating systems. Yes, platforms are important, but so are operating systems.

Over the past 20 years, most operating system development hasn't really been OS development. It's been changing fashions of desktop shells. Let's remove the borders and chrome around windows. Let's snap windows to half or a quarter of the screen. Let's show all our apps as tiles in a list. Let's provide an app store as a single point of buying and installing apps. These aren't real operating system issues.

You might be thinking that the operating system is irrelevant now, and to some extent you would be right. Sharing documents and interacting is now all done over the Web, which is essentially a distributed platform. Most of the things you do, from sending emails and IM to reading news to banking, are done in a browser that is largely independent of the underlying operating system.

However, I believe there is a third generation of operating systems that needs to be created. The first generation of OSes were basically bundles of I/O functions loosely placed into an API of sorts. The second generation of operating systems provided hardware management in the form of UNIX in mainframes and Windows NT and OS/2 in desktops. They dealt with memory management, process lifecycles, graphics adapters and gaming, and eventually multithreading and networking.

The third generation of operating systems should provide management of applications, communication, and content. I'm not talking about low-level functions already provided by OSes like talking over the network. I mean high level management of the ecosystem we now live in. A third generation OS would provide every application or service with its own environment and prevent one application from affecting another application's or the OS's environment. For example, no virus could write its code into another application's code. You could see every application on your system and there would be no place to hide because each application would have to be registered in order to even exist on the file system, much less execute.

A third generation OS would also ensure that applications didn't send your credit card information across the network without your permission. Encryption would be provided by the OS for all network traffic. It would make sure applications didn't call home without your knowledge and consent. It would prevent applications from snooping on your private information like your contact lists and what other applications you have installed. All inter-application and inter-process communication would be done through OS channels. It would prevent key loggers and any other kind of spyware.

A third generation OS would provide full management of your content from photos to videos to saved games in a way that would both simplify tasks and empower the end user. I'm not just talking about backing things up to the cloud or providing specific folders for different kinds of content. I'm talking about managing all your content, protecting it from loss or unauthorized viewing, encrypting it, copying it between machines, sharing it with other people, and even version controlling it, face-tagging it, and searching it by similar content.

There are a lot of improvements to make to the operating system layer. Why is it that we only get different color pixels every few years?

#environment

Comments 1 - 40 of 61

1   Strategist   2015 May 28, 7:01pm  

Dan8267 says

Apple has not made an operating system in this entire millennium.

They did not have to. The OS Apple made in the last millennium was designed for this millennium.
The stolen OS Microsoft made, still does not work.

2   FNWGMOBDVZXDNW   2015 May 28, 7:55pm  

I'm pretty sure that iOS has roots in BSD. This article is interesting. It says that Jobs tried to hire torvalds, and ended up hiring Hubbard.
http://www.wired.com/2013/08/jordan-hubbard/

3   Dan8267   2015 May 28, 8:19pm  

YesYNot says

It says that Jobs tried to hire torvalds

No surprise. All of Apple's success since 1999 has been based on Linux.

Strategist says

The OS Apple made in the last millennium was designed for this millennium.

Then why hasn't that OS existed since 1999?

4   Bellingham Bill   2015 May 28, 8:31pm  

by an individual because the entire microcomputer industry was boiled down to two players, Intel and Apple

You're forgetting Commodore and Atari here.

[Sales table] shows the C64 outsold the Apple II line 1983-93 and Macs in the 80s.

They showed Steve Jobs the work of the great Douglas Engelbart.

Xerox had the money in the 1970s to hire the best computer R&D team ever assembled.

http://www.amazon.com/Dealers-Lightning-Xerox-PARC-Computer/dp/0887309895

^ read that for the saga of PARC, which pioneered the modern desktop GUI and ethernet networking, in addition to object-oriented programming.

As Jobs said in his mid-90s TV interview about PARC, he saw all three in his visits but the first was so mind-blowing to him that's all he could keep as a take-away.

And the Apple equivalent of the graphical user interface was a piece of shit compared to Engelbart's already completed version.

NLS/Augment had windowing and video but no actual GUI per se. That was PARC's innovation of the 70s.

The Mac Toolbox was certainly uneven and perhaps even crap in 1984-85 (I only got enthused by Macs once System 6 came out and the Mac II was released), but it had the singular advantage of existing on the market, while Windows was still years away and the Intel platform would only fully break away from character graphics in the early 90s once Windows 3 popularized the Mac's windowing paradigm, and even its "Toolbox" APIs to a significant extent.

The Intel chips were just better for desktop computing.

Nobody in their right mind would choose any pre-486 x86 architecture for PCs. IBM only did because they didn't want their PC platform competing with their real line of business, office computing.

The sole advantage of x86 in the 1980s was the world needed an open microcomputer standard and Microsoft had the rights to license the platform away from IBM, once the BIOS was clean-roomed by Compaq and Phoenix. Without Microsoft, Compaq would have just been another Osborne, but with Microsoft, the PC platform could evolve away from IBM's control, so much so that IBM threw in the towel completely on PCs last decade. It didn't hurt that Microsoft's MS-DOS was cheap, at least compared to CP/M etc.

The x86/MS-DOS/IBM BIOS platform started getting the apps businesses wanted to run, and growth accumulated growth as the world standardized on this platform via massive network effects. Still, 680x0 was a far superior architecture through 68030 vs 386.

himem.sys, dos extenders and all that x86 bullshit were artifacts of its deeply flawed design. Don't even get me started on the x86 byzantine register architecture vs. 680x0's beautiful 32 registers.

But a funny thing happened with 68040 vs 486, Intel fixed a lot of the flaws with x86 while Motorola lost the plot. My 16Mhz 68030 wasn't all that much worse than the 68040s coming out 3-4 years later, while the 486 with its CPU multiplier (100Mhz CPU on a 25Mhz FSB) made the PCs increasingly powerful in the early 90s.

Second, the complex instructions were just quite frankly awesome and made a lot of things like playing and editing video possible.

No. MMX etc are pretty riscy actually. SIMD -- single instruction multiple data.

Even x86 is RISCy underneath the visible ISA, in the guts of the CPU where the work gets done. Without RISC it is too hard to pipeline instructions. Plus of course x86 as an ISA was horribly register-starved until AMD's x64 extensions, so in the 90s and last decade Intel resorted to tricks like register scoreboarding to get more performance.

XXM SSE2?

And Apple made the transition to PCs right when multimedia was taking off. If it hadn't, the entire Apple experience would have ended.

Actually in the late 90s PPC added "Altivec" / VMX SIMD, turning the Apple's "G3" into the "G4". We were good.

In my opinion, managed platforms are vastly superior to unmanaged platforms, which are so 20th century.

I agree 100%. Which is why I really dig C#'s infrastructure. Of course, Microsoft started backing away towards unmanaged C++ a couple of years back due to performance issues.

The fact that Android is a managed platform and iOS is not is one of the reasons I consider Android to be vastly superior to iOS

This is largely an implementation detail, or would be if the Objective C 2.0 runtime's garbage collection were a lot more robust.

Also, Apple's lack of "CIL" in C#-speak (ISA-neutral intermediate language) is a minor hassle from time to time as developers must ship "fat" binaries that have both ISAs embedded in them. This is highly suboptimal space-wise, but does simplify the runtime of course. Trivia: Apple patented fat binaries when moving from 68k to PPC!

but your iPhone runs the same operating system as my Android. That operating system is called Unix.

Actually the iOS kernel is still a Mach/BSD mishmash. It's not Linux at all.

In the late 1990s when Apple realized it could not maintain its shitty operating system -- MAC OS was the operating system equivalent of Internet Explorer -- Apple decided to use Linux as the operating systems of all its products.

The actual history is more interesting than that. For one, Apple bought NeXT in 1996 so MacOS was deadman walking well before the late 90s.

Earlier that year, they started:

http://en.wikipedia.org/wiki/MkLinux

but that was an evolutionary dead end to a large extent, as after the NeXT people took over OS development in 1997 they were going to rebase everything on NeXT's IP.

Classic MacOS could have been saved -- the company's "Carbon" cleansing of the APIs and later adoption of a PMT kernel in 8.6:

http://en.wikipedia.org/wiki/Mac_OS_8#Mac_OS_8.6

were great modernizations. The main problem was this was all C-based, and only squirrels and monkeys write full modern apps in C.

OS X, in all its versions, is simply an unmanaged platform that runs on top of Unix

Actually XNU, which is a Mach/BSD mishmash. BSD is more a personality / wrapper over the Mach kernel that is doing all the actual heavy lifting.

I'd like to see an entirely new, modern operating system created.

And with that, we can both agree. Clearly the rise of superfast SSDs is blurring the line between memory and files. This new OS will look at a single memory space that exists across the CPU(s) and GPU(s), the on-chip caches, RAM, and storage.

I'm a big fan of managed environments and would like to think Microsoft's C# ecosystem has all the bases covered to pull this off. Not that the language itself is important, that all compiles down into the CLR anyway ; )

5   Dan8267   2015 May 28, 9:34pm  

Bellingham Bill says

You're forgetting Commodore and Atari here.

OK, two major players. Atari never took off and the Commodore, although beloved by some, had only a short-lived success.

Also, I have to question the numbers as the MAC column includes the Motorola MACs, the PowerPC, and Intel (PC) "MACs". All the MACS after Apple's switch to Intel should fall under the PC column, not that it would contribute much. Also, I don't think your table is including business computers and servers which make up a huge part of the market.

Bellingham Bill says

MMX etc are pretty riscy actually.

Nonsensical statement. The MMX and XMM instructions are part of the x86 instruction set. To take a subset of a CISC instruction set and call that subset RISC is meaningless. You call these instructions the exact same way you call any other one.

Bellingham Bill says

Actually in the late 90s PPC added "Altivec" / VMX SIMD, turning the Apple's "G3" into the "G4". We were good.

Well, that's an opinion I'll gladly disagree with.

Bellingham Bill says

The fact that Android is a managed platform and iOS is not is one of the reasons I consider Android to be vastly superior to iOS

This is largely an implementation detail, or would be if the Objective C 2.0 runtime's garbage collection were a lot more robust.

Garbage collection is only one service offered by managed platforms. There are many reasons to love managed platforms.

Bellingham Bill says

Clearly the rise of superfast SSDs is blurring the line between memory and files.

Unfortunately, no. Although SSDs are much faster than hard disks, SSDs are damn slow compared to volatile system memory. The fastest SSDs available today have transfer rates of about 550MB/s. Volatile system memory has a transfer rate of 15 GB/s for DDR3 and over 21 GB/s for DDR4. That's roughly 30 to 40 times the speed, and that's just talking about sustained transfer rates. There's also latency, which is the time between the request for data and its fulfillment. Latency for DDR3 RAM is on the order of 10 nanoseconds, but it's upwards of 50,000 to 100,000 nanoseconds for SSD drives. You'd incur that latency cost every time an app tries to read a 32-bit integer from memory if you used SSD as system memory.

Maybe this difference will close, but the big question is whether non-volatile memory can ever be as fast as volatile memory. As long as we have to arrange atoms for non-volatile memory, it won't compete for speed with volatile memory, which only has to push electrons. Electrons have way less mass than the molecules we have to move or orient to store information persistently.

If SSDs could compete with volatile memory in terms of speed, databases could stop using b-trees for organizing tables, as I/O would no longer be expensive. That would change a lot of things. Right now, we're a long way from replacing DDR memory with SSD, and it's doubtful that SSD will ever be fast enough to justify using it as a replacement for volatile memory. Now using an SSD drive to hold your operating system's swap file for virtual memory is a whole different story. That can make your system run a lot faster.

6   hanera   2015 May 28, 9:49pm  

Re-writing history?

7   Dan8267   2015 May 28, 11:27pm  

hanera says

Re-writing history?

Not at all. Nor am I passing judgements except when I explicitly say so.

If you believe anything I've written to be factually incorrect, feel free to quote the alleged mistake and explain in detail why you think it is a mistake. More than likely, I will be able to show that you are incorrect by backing up the statement with references. I usually reference in my posts, but this one I wrote quickly based on my memory of living through the history of modern computing. Nonetheless, I believe you'll be hard pressed to find any significant error in my account, but feel free to try. I have no problem correcting a mistake if one is shown to exist.

8   bob2356   2015 May 29, 7:02am  

Dan8267 says

On the Apple side, Apple made computers like the Apple I and Apple ][ and eventually the Lisa, which quite frankly were amateur hobbyist quality.

The apple II was a very popular business machine. It was produced for 16 years, from 1977 to 1993. VisiCalc was the go-to program that sold lots of these machines. A plug-in z80 board to allow cpm to run was really popular. Then you could run wordstar and dbase II. I saw tons of these things in offices in the 80's. They ran a lot better than any microsoft os until the windows 98 iteration of dos. Not a hobbyist machine at all.

The lisa was also a business machine with the first graphical user interface at apple. Lisa was more advanced and sophisticated than the mac, including protected memory (not seen again at apple until OS X in 2001) and cooperative multitasking, which the mac lacked. The 5MHz 68000 just wasn't fast enough for the os. It also had some very odd features. http://www.macworld.com/article/2026544/the-little-known-apple-lisa-five-quirks-and-oddities.html The machine was both sluggish and very expensive. Not surprisingly it sold poorly. The last model of the lisa was reworked and sold as the macintosh xl. The 10k price tag, about 25k today, wasn't for hobbyists.

Bellingham Bill says

. It didn't hurt that Microsoft's MS-DOS was cheap, at least compared to CP/M etc.

The difference is cp/m worked.

Dan8267 says

In Windows, there used to be an API called Win16. That's now retired. Today Windows offers two distinct APIs called Win32 and Win64, which are 32-bit and 64-bit versions.

Windows 3.1 and earlier were not operating systems, but rather graphical shells that ran on top of the operating system called DOS.

Actually windows up through ME still ran at least partly on dos. The architecture was a pretty mixed bag starting with win95. There has been endless debate on the point. DOS was such an unmanageable, poorly designed system that gates hired dave cutler and a good chunk of his team away from dec to create an os that worked. Cutler's vax/vms was considered one of the best os's for the era. Cutler designed NT, with an awful lot of vax/vms in it, giving microsoft a modern reliable (for microsoft) os. Microsoft then spent the next 12 years painfully slowly phasing out the abortion known as ms/dos.

NT was written to be portable. It was initially developed on a mips r3000 then ported to x86 so that ms guys didn't cheat and slip in x86 code. http://windowsitpro.com/windows-client/windows-nt-and-vms-rest-story

9   Dan8267   2015 May 29, 8:54am  

Windows had more than one line of development. Windows NT was a true operating system like OS/2, shared much of the code with OS/2, and came out long before Windows ME. Windows 95 was derived from Windows 3.1, but it did generation 2 things like process management whereas the versions of Windows before 95 did not.

10   MisdemeanorRebel   2015 May 29, 9:37am  

Apple II was a piece of crap. I said "Damn I'm lucky I have a C64 at home, the graphics and sound and memory usage is magnitudes better".

The C64 was the greatest selling computer of all time. The PET was the first "plug in and use immediately" computer. Apple I was just a board in a case, you could not just "plug it in" and start experimenting. Easy Calc, the first killer App, only got written on Apple first because there was a 9 month waiting list for PETs, whereas Apples gathered dust on shelves. What killed the C64 was the Sterling's removal of the amazing Jack "Attack" Tramiel, and the horrible marketing of the Amiga.

11   Bellingham Bill   2015 May 29, 9:52am  

Apple II's main advantage was that it was semi-decent across the board with no major Achilles heel (for the time).

Its biggest advantage was simply Woz's 143k disk drive. That was bringing a gun to a knife fight in the late 70s / early 80s, until Sony invented the 3.5" floppy at least.

The A2's graphics were squirrelly, but genius programmers could get good-enough game performance out of it.

II+/IIe were too expensive for my parents to just give me but my high school let me borrow "my" machine (from the lab) over the '83 winter break and '84 summer break.

Plus EVERY. LAST. GAME. the lab had -- about 100.

I had a good time with that, LOL.

Apple was very lucky to survive on the A2 platform so long, after the A3, Lisa, and Mac 128K misfires. They shoulda gone with 68K (like the Amiga 500) with an A2 compatibility card option ca. '83, but they got the Mac out as quick as they could I guess, bugs and all.

12   Bellingham Bill   2015 May 29, 9:57am  

Dan8267 says

but it did generation 2 things like process management whereas the versions of Windows before 95 did not.

Windows 95 would still take a lock on 16-bit code, so all the process management didn't really mean much when the 16-bit side locked up.

The symptom of that was the start button going black. After that, a reboot was your only option.

NT didn't have this problem of course.

13   Bellingham Bill   2015 May 29, 10:08am  

Dan8267 says

you believe anything I've written to be factually incorrect

your big misapprehension is thinking Linus invented Unix. His stuff was a side-show in the 90s.

Linux was first announced in 1991, a year after TBL started working on his www idea on NeXT

I don't think the NeXT team got a time machine and grabbed Linus' stuff then went back to the 80s . . .

And it is possible to have the Unix API / user environment while running on another OS.

Something I wish Microsoft would do for NT. Their shell stuff is total 'galapagos mentality' junk.

14   Dan8267   2015 May 29, 10:27am  

Bellingham Bill says

Windows 95 would still take a lock on 16-bit code, so all the process management didn't really mean much when the 16-bit side locked up.

It still qualifies as an operating system. Windows 3.1 was clearly just a shell.

15   Bellingham Bill   2015 May 29, 10:41am  

Anything that can lock itself up on lower level code is not an OS.

Windows 95/98, (n): 32 bit extension and a graphical shell for a 16 bit patch to an 8 bit operating system originally coded for a 4 bit microprocessor, written by a 2 bit company that can't stand 1 bit of competition.

16   Dan8267   2015 May 29, 10:45am  

Bellingham Bill says

your big misapprehension is thinking Linus invented Unix.

I never said anything remotely like that.

I said

In 1991 Linus Torvalds created a royalty free version of UNIX that immediately became widely popular. By the late 1990s, Linux itself had been forked so many times that it itself was a family of operating systems called distributions or distros.

Linus created Linux, not Unix. I don't think I was at all unclear on this, nor was I factually incorrect.

Bellingham Bill says

I don't think the NeXT team got a time machine and grabbed Linus' stuff then went back to the 80s . . .

I don't think I was implying that, despite this being 2015, the year Marty McFly traveled to. However, Linux was open source from the beginning. People were encouraged to copy, share, and fork the code. I would be extremely shocked if Apple, a company with a very long and deep history of copying other people's intellectual property, would not take entire chunks of functionality from the range of Linux distros whenever it was cheaper, faster, and easier to incorporate some useful functionality by doing so.

In any case, my point was that by making a royalty-free, open source version of Unix, Linus Torvalds created a huge market space that resulted in smartphones predominantly running Unix today, which opened the opportunity for the iPhone and huge profits for Apple. I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.

I'm not saying that I actually like Linux, but there is no denying its importance in history.

Bellingham Bill says

And it is possible to have the Unix API / user environment while running on another OS.

That's true for any OS. APIs can be ported. User environments mocked. OSes can run as guests in virtual machines.

The Intel x86 architecture is widely successful because of its virtualization support, which wasn't even a planned industry. The 80386 CPU included a virtual 8086 mode as a backwards compatibility strategy that allowed the 386 chip to run programs written for the earlier real-mode chips. The virtualization industry, which now includes cloud computing, grew out of that design decision.

Bellingham Bill says

Something I wish Microsoft would do for NT. Their shell stuff is total 'galapagos mentality' junk.

You can install third party desktop shells and command shells for Windows and have been able to do this for years. There's not much of a market for them though on the Windows side; Linux has so many more desktop shell options. I think that's mostly a cultural thing. Linux users tend to be only tech people. Windows users run the full gamut from tech gurus to old grandmas.

17   SJ   2015 May 29, 10:50am  

I miss my old Commodore 64! It was my first computer and got me into programming in high school through college. Bulletproof, fun lots of options and gaming was incredible at the time.

18   Dan8267   2015 May 29, 10:50am  

Bellingham Bill says

Windows 95/98, (n): 32 bit extension and a graphical shell for a 16 bit patch to an 8 bit operating system originally coded for a 4 bit microprocessor, written by a 2 bit company that can't stand 1 bit of competition.

Even by that definition, Windows 95 and 98 are operating systems. The bit size of the APIs and memory space are irrelevant.

Whether or not you like a product is an opinion. Whether or not that product is an operating system is factual. A Ford Pinto may be a shit car, but it's still a car, not a banana.

As for my experience, I was highly productive running Windows 95 OSR2 and Windows 98 for about five years. Maybe you needed to do better system administration.

19   MisdemeanorRebel   2015 May 29, 10:56am  

SJ says

I miss my old Commodore 64! It was my first computer and got me into programming in high school through college. Bulletproof, fun lots of options and gaming was incredible at the time.

Absolutely! Way ahead of its time. Too bad the follow-on Commodore products were launched and managed so badly.

Did you know the 1541 was gimped to be Vic-20 compatible? It was actually at least 10x faster than it was.

20   Heraclitusstudent   2015 May 29, 11:34am  

Bellingham Bill says

In the late 1990s when Apple realized it could not maintain its shitty operating system -- MAC OS was the operating system equivalent of Internet Explorer -- Apple decided to use Linux as the operating systems of all its products.

The actual history is more interesting than that. For one, Apple bought NeXT in 1996 so MacOS was deadman walking well before the late 90s.

Earlier that year, they started:

http://en.wikipedia.org/wiki/MkLinux

but that was an evolutionary dead end to a large extent, as after the NeXT people took over OS development in 1997 they were going to rebase everything on NeXT's IP.

Apple never used Linux. They used BSD, another version of Unix.
http://en.wikipedia.org/wiki/File:Unix_history-simple.svg

MkLinux was never intended to be an Apple OS, it was just an effort to port Linux on the PowerPC computers.

21   Bellingham Bill   2015 May 29, 11:54am  

Dan8267 says

I don't think I was implying that

"Apple decided to use Linux as the operating systems of all its products"

22   Bellingham Bill   2015 May 29, 11:55am  

Heraclitusstudent says

it was just an effort to port Linux on the PowerPC computers

during this period Apple engineers, using the term loosely, were monkeying around with various approaches, like maybe even using NT's PPC kernel as the OS with "MacOS" API on top of that.

23   Bellingham Bill   2015 May 29, 11:58am  

thunderlips11 says

It was actually at least 10x faster than it was.

Actually there was a bug that they never fixed that forced the slower speed.

http://www.lemon64.com/forum/viewtopic.php?t=52439&sid=660bd9bc1f311f2aa17f8504670ff8dd

One look at my friends having to type Load * 8,1,1 or whatever put me off of the C-64 permanently.

The funny thing is we use SATA and USB now, both serial interfaces. Commodore was ahead of its time!

24   Dan8267   2015 May 29, 12:17pm  

Heraclitusstudent says

Apple never used Linux. They used BSD, another version of Unix.

I stand corrected.

In either case though, it remains one of the best decisions Apple ever made as continuing MAC OS would have been disastrous. Furthermore, by converting to Unix, Apple by luck of timing was prepared to enter the emerging MP3 player and smartphone markets, which they wouldn't have been if they had stuck with MAC OS.

I also stand by the statement that Linux was largely responsible for the acceptance of Unix in general on small portable devices including smartphones. Getting an operating system for free to use in your commercial product is a tremendous financial motivation, and that led to the recent revival of Unix on everything from smartphones to servers.

25   Dan8267   2015 May 29, 12:23pm  

Bellingham Bill says

Dan8267 says

I don't think I was implying that

"Apple decided to use Linux as the operating systems of all its products"

I was wrong about MAC OS being based on Linux. I should have said BSD Unix. Yes, even my memory is imperfect. One confusion over the tons of Unix forks is understandable.

However, I still don't think I was implying that Linus Torvalds invented Unix, which is what I thought you meant by

your big misapprehension is thinking Linus invented Unix. His stuff was a side-show in the 90s.

Linux was first announced in 1991, a year after TBL started working on his www idea on NeXT

I don't think the NeXT team got a time machine and grabbed Linus' stuff then went back to the 80s

26   hanera   2015 May 29, 1:21pm  

Dan8267 says

I would be extremely shocked if Apple, a company with a very long and deep history of copying other people's intellectual property, would not take entire chunks of functionality from the range of Linux distros whenever it was cheaper, faster, and easier to incorporate some useful functionality by doing so.

In any case, my point was that by making a royalty-free, open source version of Unix, Linus Torvalds created a huge market space that resulted in smartphones predominantly running Unix today, which opened the opportunity for the iPhone and huge profits for Apple. I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.

Pure conjecture. So much hatred? Lucky for me I don't have any emotional baggage to hate Apple, switch from a Windows user to a Mac user in 1997 and buy AAPL for many years. Ahem... lucky choice or willing to change views when the tide change, up to you to opine.

27   Dan8267   2015 May 29, 1:45pm  

hanera says

Pure conjecture.

No. Simply empirical facts.

hanera says

So much hatred?

Like most Americans, you make the false assumption that everybody either has to be on your team or the polar opposite. Life isn't like that.

Both Apple and Microsoft, and quite a few other players in the 1980s, have stolen a lot of ideas from each other. How is pointing out a historical fact hatred? If you state that America practiced slavery for nearly 100 years, does that mean you hate America? Is whitewashing history a form of patriotism? Why should I whitewash the history of computing?

hanera says

Lucky for me I don't have any emotional baggage to hate Apple

One cannot love or hate corporations, countries, or other imaginary designations based on pieces of paper. You cheapen the words love and hate by using them to describe such trivial things.

Furthermore, you entirely missed the point of this thread, which is to inform, not persuade.

28   hanera   2015 May 29, 2:03pm  

If you're informing, why so many ad hominem attacks? And pure conjecture? And so many opinionated assertions?

From comment 26, examples of emotionally charged opinions are: "extremely shocked", and "even go as far as to say".

29   Bellingham Bill   2015 May 29, 2:16pm  

Dan8267 says

continuing MAC OS would have been disastrous. Furthermore, by converting to Unix, Apple by luck of timing was prepared to enter the emerging MP3 player

you're kind talking out of school here . . . whatever embedded OS the PortalPlayer chipset used on the first iPods, it wasn't running Unix!

NeXT/Apple's winning approach was a) Tevanian's Mach kernel, which served as the HAL + b) 4.3BSD API for processes etc and CLI tooling + c) OO Objective-C APIs to abstract all the C nastiness going on at the OS layer + d) visual design tools to put together apps from UI components (buttons, menus, textviews. etc) faster.

NeXT's kernel was called XNU, which stands for . . . Xnu's not Unix!

acceptance of Unix in general on small portable devices including smartphones.

nah, Sharp released the Zaurus in 2002, based on Unix, and that didn't go anywhere.

Unix is an implementation detail with Android. If they had chosen QNX or VxWorks instead they'd have been equally as successful.

VxWorks actually performed an interplanetary patch operation back in 1997, that was impressive.

http://space.stackexchange.com/questions/9178/how-did-nasa-remotely-fix-the-code-on-the-mars-pathfinder

Linux, and Unix in general is just like McDonald's, Walmart, or Microsoft.

They got big for real reasons, but these reasons are not necessarily 100% goodness and light, sometimes just contingency, accident, and network effects.

Unix got a good start with C, BSD freeing the code from AT&T's control, the GNU project, and UNIX's mindset of a lot of little pieces working together sequentially is better than hefty things working alone.

It's just code, all the way down, man.

30   Dan8267   2015 May 29, 2:28pm  

hanera says

ad hominem attacks

An ad hominem attack, by definition, can only be levied against a person, not a company. But if you meant expression of negative opinions... Well, in the entire original post, I expressed only three negative opinions about Apple products.

1. OS 9 was a piece of shit.
Most Apple users agree with this and wetted their pants with excitement over OS X. But even in that opinion, my point was that Apple was highly motivated to abandon an operating system they simply could not continue to maintain. It was time for a break.

2. The Apple I and Apple ][ and Lisa were amateur hobbyist quality.
I stand by that opinion. You are free to disagree with it, but I see no harm in injecting a little bit of opinion when giving a long talk about the sometimes beautiful, sometimes ugly, always interesting history of the rise of modern computing.

3. Apple's imitation of Douglas Engelbart's work was not as good as Engelbart's.
OK, you could take this as a negative judgement on Apple or as respect for the great Douglas Engelbart. I've often said that Microsoft's development model is to make a shitty copy of an existing product and keep refining it until it becomes good five or more years later.

So at most three negative opinions were expressed about Apple in the original post. That hardly constitutes Apple bashing, especially given that these opinions are well-founded. Only a fanboy holds the position that no criticism of an entity is acceptable. Are you a fanboy?

hanera says

From comment 26, examples of emotionally charged opinions are: "extremely shocked", and "even go as far as to say".

Dan8267 says

However, Linux was open source from the beginning. People were encouraged to copy, share, and fork the code. I would be extremely shocked if Apple, a company with a very long and deep history of copying other people's intellectual property, would not take entire chunks of functionality from the range of Linux distros whenever it was cheaper, faster, and easier to incorporate some useful functionality by doing so.

How is that an attack on Apple? Are you saying that no developer at Apple ever looked up code on the Internet, said "hmmm, that solves the problem I'm working on", and then incorporated the code with or without modifications? Unless Apple had the very dangerous "not invented here" syndrome, it would be foolish to re-invent the wheel. It is also astronomically improbable that Apple, working on its own version of Unix, would not leverage the open-source Linux in any way. In fact, it would be damn stupid if they didn't.

Dan8267 says

I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.

Again, how is this an attack of any sort, much less an unfounded attack? Do you have such a negative opinion of Linux that the mere implication that it has influenced a product is some form of denigration?

Again, you are missing the point of this entire thread. Although I have sprinkled a few opinions in my recalling of history, as any human being would (that's a large part of what makes us human and our history interesting), my accounting is almost entirely factual and informative exposition without judgement. And in the few places where I expressed judgement, the purpose was to illustrate the motivation of the players to change their products such as Apple wanting to move to a different, proven operating system.

Do you really think I would never say anything bad about Microsoft? Obviously you haven't paid attention to my previous tech posts. I have no problem praising and condemning the same company for its good work and its bad. I don't engage in the bullshit arbitrary cultural wars that dominate most people's thinking. In fact, the entire purpose of this thread was to steer people away from such cultural wars by giving them a better and correct understanding of what operating systems and related concepts are and how they work.

31   Dan8267   2015 May 29, 2:41pm  

Bellingham Bill says

you're kind talking out of school here . . . whatever embedded OS the PortalPlayer chipset used on the first iPods, it wasn't running Unix!

Once again you are reading incorrect things into my words. In the statement

I'd even go as far as to say the iPod and other small devices are largely influenced by the creation of Linux.

I was discussing the influence of Linux on the emerging portable computing market, not describing individual product implementation.

Forget about technology for a moment and think of economics. Without Linux, companies developing MP3 players, phones, and tablets would not have a clear OS to choose for their product. The market would be fragmented. But because of the popularity and royalty-free licensing Linux brought to Unix in general, hardware manufacturers could build their hardware to run Unix. Yes, there are lots of versions of Unix, but they are all closely related and have a lot of overlap. Hardware manufacturers could confidently build new hardware without the worry that it would not be compatible with the predominant operating system. This created the opportunity for rapid advancement in what kinds of things small portable devices could do such as handle GPS, know their orientation, have cameras, proximity detectors, etc.

In this way, Linux influenced the entire portable computing market even for non-Unix based devices. Linux made Unix far more popular.

Bellingham Bill says

nah, Sharp released the Zaurus in 2002, based on Unix, and that didn't go anywhere.

In every period of great advancement, there are tens, hundreds, even thousands of losers for every winner. Pets.com didn't go anywhere, but that doesn't mean the whole Internet thing was a bust.

Bellingham Bill says

It's just code, all the way down, man.

Of course everything's code all the way down to the hardware. But code matters. Code can be of excellent or terrible quality. And it's not just the code itself. It's the architecture, the design, and the concepts behind the code, not just the expressions written in some programming language.

However, I don't want this thread to turn into an argument over which OS is best. We can do that in another thread. I intended this thread to correct the predominant misunderstanding that the pixels on the screen are the OS, that the desktop shell is the operating system rather than simply a replaceable interface to the OS.

32   Bellingham Bill   2015 May 29, 2:43pm  

As for continuing with MacOS in the 1990s being a disaster, I dunno.

MacOS was actually aggressively modernized 1997-2000 while the NeXT people rewrote and rearchitected OS X for its debut.

OS X only became the default boot OS in 2002, after 5+ years of development at Apple.

And in the end the compatibility story was the "BlueBox" which ran pre-existing "MacOS" binaries in a virtual machine, with decent interop with the OS X windowing manager to hide the hack.

Carbon:

http://en.wikipedia.org/wiki/Carbon_%28API%29

the modernized MacOS C API that supported PMT (preemptive multitasking) and ran as a first-class citizen on OS X (Photoshop, Excel, the Finder itself, etc.) only became deprecated in 2007.

I had the opportunity to see the sausage as it was being made in the middle of this process, and I don't see Unix's brilliance here.

gcc was abandoned as crap, X11 was never adopted, and POSIX could have been provided the way Microsoft wrote WinSock for NT.

Apple would be irrelevant now but for the iPod and iPhone, and the success of these didn't have anything to do with Unix.

The iPod succeeded because Apple cleverly bought all the 1.8" hard drives Toshiba could manufacture, enabling Apple to corner the market on gigabyte PMPs that could fit in a pocket (until flash got cheap enough). That and Apple doing the heavy lifting getting the iTMS off the ground in 2003, at $1 per song.

The iPhone succeeded for several reasons. Foremost because the telcos were shipping truly horrible product and Nokia just couldn't quite put all the right pieces together.

They had OMAP, which was equivalent to the iPhone 1's hardware, but didn't have the API design or the GL drivers working well, while the iPhone had a fully performant PowerVR renderer doing all the graphics updates, along with the CoreAnimation API that made it possible for programmers to build the dynamic, animated UI that users loved.

The iPhone's biggest wins were losing the stylus, losing extraneous phone-related buttons, cursor keys, and all that crap, which enabled them to go HVGA at roughly 160 DPI, good enough to support a real web browser on the device.

33   Bellingham Bill   2015 May 29, 2:44pm  

Dan8267 says

Without Linux, companies developing MP3 players, phones, and tablets would not have a clear OS to choose for their product.

Why the #*%) does an MP3 player need Unix???

The NeXT people running Apple in 2000-2001 had been elbows-deep in Unix since 1986 with NeXT and earlier in school etc.

They didn't need Unix for the iPod, they found the chipset they needed with PortalPlayer, iterated on the final design and UI for some number of months, and shipped the product forthwith.

34   Dan8267   2015 May 29, 3:01pm  

Bellingham Bill says

Why the #*%) does an MP3 player need Unix???

Why does a phone need to? It doesn't. But eventually a product is expanded and does more things. First it plays back MP3s, then it records voice, then it plays short videos as well, then it takes pictures, then videos, then it sends email...

One of the first hardware MP3 players was the Diamond Rio, which came out in 1998. Today I play music on my smartphone, which really shouldn't even be called a phone. It's really a hand-held computer that has telephony services as one of its many functions. There's a big difference between the Rio and a modern smartphone.

One can even argue that the only hand-held device you need today is what we call a smartphone. Why carry a separate music player, a separate voice recorder, a separate digital camera, a separate portable gaming console, or even a separate watch? I stopped wearing a watch years ago. Personally, I'd rather carry one device than several, and the smartphone is a great platform for all those activities. Of course, with that complexity, you need an operating system. I'd argue that having a platform, particularly a managed one, on top of it is a very good idea.

35   Dan8267   2015 May 29, 3:03pm  

Bellingham Bill says

As for continuing with MacOS in the 1990s being a disaster, I dunno.

MacOS was actually aggressively modernized 1997-2000 while the NeXT people rewrote and rearchitected OS X for its debut.

Ah, but my point is that even doing that ends Mac OS and starts a new operating system, even if it is still marketed under the same name. It's an entirely different beast.

36   Bellingham Bill   2015 May 29, 4:17pm  

Dan8267 says

even if it is still marketed under the same name. It's an entirely different beast.

http://en.wikipedia.org/wiki/Ship_of_Theseus

37   Bellingham Bill   2015 May 29, 4:33pm  

Dan8267 says

Why does a phone need to? It doesn't.

Actually, a "smart" phone needs "powerful" application frameworks to construct the UX of the apps the user is using on the phone.

Application frameworks don't exist in a vacuum; as a platform, they have technical underpinnings that exercise lower API levels like POSIX or MS-DOS.
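
As a rough sketch of that layering (framework_load_image is an invented name, purely for illustration; only the POSIX calls underneath it are real), the kind of convenience call a framework hands to app developers typically bottoms out in plain POSIX I/O:

#include <fcntl.h>
#include <stdlib.h>
#include <unistd.h>

/* Imaginary framework-level call an app programmer would use.
 * Underneath, it is just exercising the lower POSIX layer. */
unsigned char *framework_load_image(const char *path, size_t *out_len)
{
    int fd = open(path, O_RDONLY);             /* POSIX */
    if (fd < 0)
        return NULL;

    off_t len = lseek(fd, 0, SEEK_END);        /* POSIX: find the file size */
    lseek(fd, 0, SEEK_SET);

    unsigned char *buf = malloc((size_t)len);
    if (buf != NULL && read(fd, buf, (size_t)len) == (ssize_t)len)
        *out_len = (size_t)len;                /* decoding, caching, etc. would follow */
    else {
        free(buf);
        buf = NULL;
    }

    close(fd);
    return buf;
}

The framework is the product the app programmer sees; the POSIX layer is the plumbing it stands on.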

The first iPhone had a 400MHz 32-bit RISC ARM processor (gee, wonder why they didn't choose x86), 128MB of RAM, and 8GB+ of flash storage, plus a PowerVR rendering engine.

This was roughly equivalent to the iMacs Apple was shipping in the late '90s, which, conveniently, was also the minimum spec OS X was supposed to support in its first iterations.

This ARM hardware (essentially OMAP) had no proper, well-developed OS in 2005 (just look at the crap Nokia shipped running on it), so Apple could not buy an off-the-shelf OS solution; they had to develop it in-house.

With Mach as the HAL, they could get XNU up, and once XNU was up they could look at what existing desktop APIs they needed to get working for the apps they wanted to run on the phone.

Curiously, they did take the opportunity to redesign a lot of the existing AppKit framework, so much so that they shipped it named "UIKit" instead. But most of the existing OS X platform came over; they didn't have the time or risk tolerance to reinvent any wheels here.

38   EBGuy   2015 May 29, 4:35pm  

You guys are loads of fun. Do iOS and OS X run on the same operating system? Discuss.

39   Dan8267   2015 May 29, 6:58pm  

iOS and OS X are both variants of Unix with a graphical shell on top. Are any two customized versions of Unix the same operating system? Are Lutheranism and Catholicism the same religion because they are both forms of Christianity?

Like I said in the original post, Unix is best thought of today as a family of operating systems. There is no clear line that distinguishes one species from another. I would consider iOS to be materially different from OS X, as it's customized to run on phones.

40   justme   2015 May 29, 11:44pm  

Dan8267 says

Now using PCs as the basis for their products was a great improvement. Intel released the MMX and later XXM instruction set, which are CPU instructions that deal with massively parallel arithmetic operations designed to support multimedia.

"their products" == apple products.

You make it sound like Apple switched to x86 BEFORE Intel came out with MMX. That is not correct. MMX was introduced in 1997. Apple announced their first Intel/x86-based machines in Jan. 2006. And there is no such thing as "XXM". I think you mean SSE.

I enjoyed your history of personal computing and operating systems, and it is largely correct, but some of the details are not quite right. Another case in point is what you say about PowerPC.

Dan8267 says

First, the Intel chips were always clocked faster than the PowerPC chips, completely defeating the purpose of the entire RISC architecture.

Not true at all. The first several generations of PowerPC in the 1990s were far ahead of Intel, performance-wise and at times also clock-wise. Besides, the point of RISC is not necessarily to achieve the highest clock frequency. It is to achieve better throughput by simplifying the instruction pipeline, often making it shorter than in a CISC implementation. In other words, cycles per instruction (CPI) also matters. This is why AMD was creaming Intel on performance around 1999. Intel was excessively pipelining their circuits to drive up the clock frequency (good for marketing), but it did not pay off on the CPI measure.
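
To put made-up numbers on it (these are invented purely to illustrate the trade-off, not real chip specs): execution time is instructions * CPI / frequency, so a lower-clocked part can still finish first if its CPI is better.

#include <stdio.h>

int main(void)
{
    /* Same binary, same instruction count on both hypothetical chips,
     * to isolate the effect of CPI versus clock frequency. */
    double instructions = 1e9;

    double time_low_clock  = instructions * 1.2 / 800e6;   /* 800 MHz, CPI 1.2 */
    double time_high_clock = instructions * 2.5 / 1200e6;  /* 1.2 GHz, CPI 2.5 */

    printf("low-clock chip:  %.2f s\n", time_low_clock);   /* prints 1.50 */
    printf("high-clock chip: %.2f s\n", time_high_clock);  /* prints 2.08 */
    return 0;
}

The deeper the pipeline, the higher the clock can go, but mispredictions and stalls push CPI up, which is exactly the trap described above.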

Back to PowerPC: PowerPC existed long before (and after) it became Apple's replacement for the 68k processors from Motorola. Pricing was not good, though. Around 2003, Intel started pulling away because their wafer technology (device technology, lithography, etc.) offered better cost/performance than IBM's and Motorola's. That is one of the main reasons Apple made the switch to Intel in 2005-2006.
