So, why did the C128 fail?

Started by Blacklord, December 29, 2007, 11:41 AM



airship

I've always heard '2 million' as the top end for C128 sales. What's the highest serial number you've seen?
Serving up content-free posts on the Interwebs since 1983.
History of INFO Magazine

nikoniko

You guys at INFO reported 2 million in Issue 22, September '88. Do you think C= had any meaningful sales after that point? Someone who worked on the Wikipedia article gives 4 million units total (without any source reference), which sounds kinda high. But the DCRs were a really good deal when they came out, and an even better deal when retailers were selling them for practically nothing, so the early 90s probably saw some movement. I could believe 3 million units.

airship

The problem with that '2 million' number is that it's how many Commodore claimed to have shipped from the warehouse by then. All of the units sold at retail after that could have been included in that number, which might have been optimistic to begin with. CBM were not known for their tendency towards veracity. That's why I'm interested in actual serial numbers out in the field.

nikoniko

Hmm. How were serial numbers handled with the introduction of the D, and later the DCR? Did each model start over at 1?

BigDumbDinosaur

"So, why did the C128 fail?"

What is meant by "fail?"  The unit did sell in surprisingly large numbers, so in that regard, I don't think there was any kind of failure.  The 128 delivered as promised (a few teething bugs notwithstanding), so technically speaking, it wasn't a failure.

The real failure was in Commodore management.  CBM's efforts were being directed in too many directions, lacked any kind of focus, and, as pointed out earlier, got caught up in producing useless technology (e.g., the C16).  Had they stayed focused on one or two product lines, the C128 may well have taken over the 64's lead.

I always felt that including CP/M capability in the 128 was pointless.  As noted above, by the time the 128 was available for purchase, CP/M was rapidly being shoved aside by MS-DOS, whose similarity to CP/M was sufficient to assure success.  CP/M was already dragging around a lot of Z-80 baggage and was largely obsolete by the mid-1980's.  Given that, why bother with what was quickly turning into a moribund operating system?  Using the large number of CP/M titles as an excuse didn't wash, as a lot of that software was from another era when microcomputers were much less capable.  For example, hi-res graphic support in CP/M was next to non-existent, so what good was it with a machine like the 128 that had hi-res capabilities?  In the case of the 128, giving it CP/M capability was nothing more than pouring some moldy old wine into a new bottle.

Having said that, I suppose if CBM had worked much harder to promote the 128's CP/M capability and had made CP/M more accessible to the uninitiated, they might have enjoyed some success in that area.  Again, however, CP/M was already a historical artifact in 1985 and it could be that during the time period from inception to actuality, CBM realized that they had expended too much effort on the 128's CP/M personality and thus didn't feel compelled to follow through in the support area.

As an aside, perhaps if someone other than Von Ertwine had been commissioned to make the CP/M port it would have been a better performing package and might have succeeded (CP/M on the 128 was pitiful in performance compared to native Z-80 powered machines).  I and others who were heavily involved with 128 software development in the latter 1980's felt that Von Ertwine's port was bloated and poorly executed (the business of switching between processors to perform I/O was downright dumb, although some of the blame for that rests with the hardware design -- again, why bother with the Z-80?).  My opinion is that the 128 would have been much better served by forgetting CP/M and focusing exclusively on its native mode capabilities, especially in the area of hi-res graphics on the 80 column side.

I maintain that the C-64 compatibility mode was ill-advised and totally unnecessary.  If the focus on the 128 had been solely on it being a 128 and not all things to all men, the technology would have been far more powerful, developers would have been more inclined to develop for it, more machines would have been sold, and a self-sustaining "economy" would have arisen like that which embraced the C64.

The few commercial titles that were written to the 128's native mode personality were pleasant to use and generally worked well.  For example, I used SuperScript 128 for some 7 years and found it more than adequate for the sort of word processing I did in those days.  Once I figured out exactly how SS 128 ran in the 128 architecture (the program itself runs in RAM bank 1 and the document being edited is loaded into RAM 0), I was able to capture it to my Lt. Kernal system as a type 5 contiguous binary file, patch the startup code to bypass the disk security check, and modify the file I/O routines to directly interact with the LK's SCSI disk access primitives, resulting in what was arguably the world's fastest C-128 word processing environment.  ;)  Although I never tinkered with it very much, I also found a lot of capability in BASIC 8 (the documentation that came with it left a lot to be desired, however).

Regarding the occasionally-maligned VDC, it was more than adequate in the context of the 128, and generally reflected the chip-making technology of the time, especially as embodied in the CSG's capabilities.  Actually, the 8568 version found in the 128DCR was more technically capable than the 8563 (more adaptable to various display configurations), and did have an unused interrupt capability, which was tied to the status register's bit 7.  Obviously, someone somewhere at either CSG or Commodore itself thought that an IRQ-driven 80 column display might be something worth exploring, although, as we all know, that too was never developed.  Incidentally, I never saw the lack of sprites as a weakness, since I was not (and still am not) into game playing on computers.  I viewed sprites as part of the "toy personality" of the C-64.

It is true that the VDC is/was somewhat obtuse to control.  However, it was no more difficult to work with than other CRT controllers of the era, such as the 6545e, on whose architecture the 8568 controller was based.  The VDC had a huge advantage over the VIC in that it did not need to usurp bus cycles to do its job, which meant that even in slow mode, the 128 would run faster if only the 80 column side was in use.  In fast mode with %00000011 written into the VIC's register $30 (which also kills raster interrupts -- don't try it from the keyboard), the system ran a good 2.2 times faster than it would in 1 MHz mode with video going to the VIC.  So, all-in-all, the VDC was okay -- a little hard to work with until the programmer had a clear understanding of what it was doing, but certainly more capable in most regards than the VIC.

One place where CBM definitely dropped the ball was in I/O capability.  Instead of kludging the lame serial bus with the "fast" routines, true IEEE-488 capability should have been native, and not the slow IEEE-488 of the old PET/CBM line (which only ran at 1.2 KB/sec, a fraction of what IEEE-488 was capable of).  A real RS-232 port should have been present, the I/O block should have been more finely decoded to allow more devices to share that part of the memory map, and a Centronics port should have been available to provide native support to mainstream printers.
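For a sense of scale on that quoted 1.2 KB/sec figure, here's a back-of-the-envelope calculation (the ~170 KB disk size is an assumed round number for a 1541-class drive, not something stated in the post):

```python
# Rough arithmetic on the bus-speed complaint above: at the quoted
# 1.2 KB/s of the old PET IEEE-488 implementation, moving a full
# 1541-sized disk's worth of data (~170 KB, an assumed round figure)
# takes well over two minutes.

KBPS = 1.2      # quoted PET/CBM IEEE-488 throughput, KB/s
DISK_KB = 170   # approximate 1541 formatted capacity, KB (assumption)

seconds = DISK_KB / KBPS
assert 140 < seconds < 145  # a bit under 2.5 minutes
print(f"{seconds:.0f} s to move {DISK_KB} KB at {KBPS} KB/s")
```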

Regarding BASIC 7.0, I didn't do much development in it because it was too slow to do what I wanted.  Also, despite the greatly extended language offered in the 128 mode, it still was lacking in a number of hardware support areas.  For example, why weren't TI and TI$ tied to one of the TOD clocks instead of the unreliable IRQ-jiffy "clock?"  Why weren't there any keywords to manipulate the user port lines, instead of having to PEEK and POKE?  Where was the support for direct control of the VDC (rather than having to go through the screen editor ROM)?

All-in-all, the 128 delivered no more or less than promised.  It simply didn't deliver enough to make it a clear choice over anything else available at the time.  Running in 64 mode, it did little more than a real 64, but at a much higher cost.  Running as a 128, it didn't have enough processing speed or capacity to make it a viable choice against the PC architecture of the time, nor did it have anything to offer against the Amiga (which was technically superior to everything else at the time).  In CP/M...well as a CP/M machine it made a pretty good door stop.  :-)
x86?  We ain't got no x86.  We don't need no stinking x86!

nikoniko

Quote from: bigdumbdinosaur
Actually, the 8568 version found in the 128DCR was more technically capable than the 8563 (more adaptable to various display configurations), and did have an unused interrupt capability, which was tied to the status register's bit 7.
Would you care to elaborate on that interrupt capability? Sounds interesting.

Quote
Incidentally, I never saw the lack of sprites as a weakness, since I was not (and still am not) into game playing on computers.  I viewed sprites as part of the "toy personality" of the C-64.
Whether one thinks of sprites as toys or not, redefinable graphic overlays are useful things. (Mouse cursors, status displays, etc.)

BigDumbDinosaur

Quote from: nikoniko
Quote from: bigdumbdinosaur
Actually, the 8568 version found in the 128DCR was more technically capable than the 8563 (more adaptable to various display configurations), and did have an unused interrupt capability, which was tied to the status register's bit 7.
Would you care to elaborate on that interrupt capability? Sounds interesting.
There isn't too much about which to elaborate, as the 8568's /IRQ line is not connected to anything (it's brought out on pin 9 of the chip's package).  Someone with reasonable skill could wire it to the MPU's /IRQ line, which would make the 8568 another interrupt source.  Incidentally, except at the software level, the 8568 is completely incompatible with the 8563.  The pinouts are different, some key register settings are different, local memory interface is totally different, etc.

As far as using the 8568's IRQ, what it does is reflect the status of bit 7 in the control register at $D600.  After writing a register number to $D600, the MPU would normally spin in a loop, waiting for bit 7 in $D600 to be set.  When that happens in the 8568, /IRQ will be asserted (pulled low), which would generate an IRQ if pin 9 of the chip was actually tied to the MPU /IRQ line.  That is, /IRQ is asserted upon a 0 to 1 transition of bit 7.  Upon a read operation of the status register, the 8568 will deassert /IRQ.  That's all there is to it...sort of.
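The polled handshake described here only runs on real hardware, but its logic can be sketched as a small model (Python here; the class and function names and the instant-ready behaviour are illustrative inventions -- only the $D600/$D601 addresses and the bit 7 semantics come from the post):

```python
# Toy model of the C128 VDC access handshake described above.
# $D600 is the address/status register, $D601 the data register;
# "ready" is bit 7 of the status byte. This is a simulation, not
# real register access.

class VDC:
    READY = 0x80  # status bit 7

    def __init__(self):
        self.regs = [0] * 37       # 8563/8568 internal registers
        self.addr = 0
        self.irq_asserted = False  # models the 8568's (unconnected) /IRQ pin

    def write_d600(self, reg_num):
        """Select an internal register (write to $D600)."""
        self.addr = reg_num
        # In this toy model the chip is ready immediately; on real
        # silicon some time elapses before bit 7 comes up.
        self.irq_asserted = True   # 0-to-1 transition of bit 7 asserts /IRQ

    def read_d600(self):
        """Read status (read of $D600); deasserts /IRQ per the post."""
        self.irq_asserted = False
        return self.READY          # bit 7 set: ready

    def write_d601(self, value):
        self.regs[self.addr] = value

def vdc_poke(vdc, reg, value):
    """The conventional polled write: select, spin on bit 7, then store."""
    vdc.write_d600(reg)
    while not (vdc.read_d600() & VDC.READY):
        pass                       # the MPU spins here on real hardware
    vdc.write_d601(value)

vdc = VDC()
vdc_poke(vdc, 26, 0xF0)            # hypothetical register write
assert vdc.regs[26] == 0xF0
```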

You may know that video in an industry-standard PC is/can be interrupt-driven.  In theory, the same could be done with the 8568.  The rationale is that a measurable amount of time may elapse from when the desired register value is stored into the VDC control register to when the VDC is actually ready for the read or write to the data register.  So, why not have the VDC control subroutine write the control register value and then, instead of looping and watching $D600 for a bit 7 transition, immediately return to the caller?  Later, when the VDC finally reacted to the register setup instruction, it would generate an IRQ saying, "I'm ready for your next instruction."  The IRQ handler would then complete the operation.  The theoretical advantage to this scheme is that the processor wouldn't be wasting time waiting for the VDC to react to the register setup step.  Thus it could be theorized that the system's overall throughput would be increased.
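The deferred, IRQ-driven variant can be modeled the same way: queue the operation, return at once, and let a (simulated) interrupt handler finish the transfer when the chip signals ready. Everything below is an invented sketch of the scheme as outlined, not anything that ever shipped:

```python
# Toy model of the hypothetical IRQ-driven VDC access described above:
# the main code starts the operation and returns immediately; a
# (simulated) interrupt handler completes it when the chip is ready.

class IrqDrivenVDC:
    def __init__(self):
        self.regs = [0] * 37
        self.pending = None      # (reg, value) awaiting completion

    def start_write(self, reg, value):
        """Select the register and return at once (no busy-wait)."""
        self.pending = (reg, value)
        # On real hardware the 8568 would later pull /IRQ low.

    def irq_handler(self):
        """Runs when the chip raises 'ready'; completes the transfer."""
        if self.pending is not None:
            reg, value = self.pending
            self.regs[reg] = value
            self.pending = None

vdc = IrqDrivenVDC()
vdc.start_write(12, 0x10)    # the caller keeps working after this call
assert vdc.regs[12] == 0     # transfer not yet complete
vdc.irq_handler()            # "chip ready" interrupt arrives
assert vdc.regs[12] == 0x10
```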

As to whether trying something with the 128D is worth the bother, who knows?  The VDC is actually faster in some ways than the 8502 running at 2 MHz (the VDC can move data in its local VRAM faster than possible with 8502 instructions), so the work it would take to rewrite the VDC driver code to work with IRQs might not produce a result that is commensurate with the effort involved.  I suspect the interrupt capability may have been meant for a different system design running a faster processor on a faster bus, where IRQ-driven video would make more sense.
Quote from: nikoniko
Quote from: bigdumbdinosaur
Incidentally, I never saw the lack of sprites as a weakness, since I was not (and still am not) into game playing on computers.  I viewed sprites as part of the "toy personality" of the C-64.
Whether one thinks of sprites as toys or not, redefinable graphic overlays are useful things. (Mouse cursors, status displays, etc.)
Perhaps I should have worded that as "part of the recreation personality of the C-64."  My background is in large systems for business use, where things like sprites are not commonly found.  My interest in the 128 from the beginning was to harness it for business data processing.

smf

Quote from: bigdumbdinosaur
In fast mode with %00000011 written into the VIC's register $30 (which also kills raster interrupts -- don't try it from the keyboard), the system ran a good 2.2 times faster than it would in 1 MHz mode with video going to the VIC.  So, all-in-all, the VDC was okay -- a little hard to work with until the programmer had a clear understanding of what it was doing, but certainly more capable in most regards than the VIC.
I'd have preferred an 80 column 2 MHz VIC. With a 2 MHz processor and RAM that could run fast enough that the VIC and CPU would never have to fight over the bus. Chuck in 8-bit color RAM for foreground and background color per 8x8 matrix. With a switch to make it all go back to 64 mode, though in '85 you probably wouldn't have cared anyway. Most of the techniques that need 100% accurate timing came later.

A c64+ like that would have been much better and probably no more expensive to make than the c128.

As the c128 was just a stop-gap measure to keep commodore in the news, they just pulled pieces from all over the place & had no real design.

But I agree about the serial port, the 1541 should never have been released. They should have given up on software serial with the vic 20. I think a fast serial bus though would be just as effective as a parallel bus, it's not like either end could actually process data that fast anyway.

BilHerd


"So, why did the C128 fail?"

What is meant by "fail?"  The unit did sell in surprisingly large numbers, so in that regard, I don't think there was any kind of failure.  The 128 delivered as promised (a few teething bugs notwithstanding), so technically speaking, it wasn't a failure.

>> I heard it did almost a billion in revenue by the end.  Not bad for 5 months' effort, plus we had something to show before the Amiga was ready.

The real failure was in Commodore management.  CBM's efforts were being directed in too many directions, lacked any kind of focus, and, as pointed out earlier, got caught up in producing useless technology (e.g., the C16).  Had they stayed focused on one or two product lines, the C128 may well have taken over the 64's lead.

>> There was no focus, there was no single driver after Jack left.  Marshall Smith was taking advice from the CEO of Tandy.

I always felt that including CP/M capability in the 128 was pointless.  As noted above, by the time the 128 was available for purchase, CP/M was rapidly being shoved aside by MS-DOS, whose similarity to CP/M was sufficient to assure success.  CP/M was already dragging around a lot of Z-80 baggage and was largely obsolete by the mid-1980's.  Given that, why bother with what was quickly turning into a moribund operating system?  Using the large number of CP/M titles as an excuse didn't wash, as a lot of that software was from another era when microcomputers were much less capable.  For example, hi-res graphic support in CP/M was next to non-existent, so what good was it with a machine like the 128 that had hi-res capabilities?  In the case of the 128, giving it CP/M capability was nothing more than pouring some moldy old wine into a new bottle.

>> Only cost us $1 so it didn't hurt, but the real reason was that the CPM cartridge didn't work very well and to be compatible we had to get the function in spite of the cartridge (I didn't buy the argument that the 128 was so compatible that the CPM cartridge didn't work on it either).  Plus I would have had to make the power supply handle a little more than an extra half an amp (which cost more than the Z80 on the motherboard did), plus I used the Z80 to boot C128 mode when game cartridges started to dynamically toggle control lines that had only been thought to have been permanently tied to a signal up until then.

Having said that, I suppose if CBM had worked much harder to promote the 128's CP/M capability and had made CP/M more accessible to the uninitiated, they might have enjoyed some success in that area.  Again, however, CP/M was already a historical artifact in 1985 and it could be that during the time period from inception to actuality, CBM realized that they had expended too much effort on the 128's CP/M personality and thus didn't feel compelled to follow through in the support area.

>> They didn't do anything to promote it.  Course they didn't do anything to promote the Amiga.  In both cases they thought they could sit and wait for orders like the latter days of the C64.

As an aside, perhaps if someone other than Von Ertwine had been commissioned to make the CP/M port it would have been a better performing package and might have succeeded (CP/M on the 128 was pitiful in performance compared to native Z-80 powered machines).  I and others who were heavily involved with 128 software development in the latter 1980's felt that Von Ertwine's port was bloated and poorly executed (the business of switching between processors to perform I/O was downright dumb, although some of the blame for that rests with the hardware design -- again, why bother with the Z-80?).  My opinion is that the 128 would have been much better served by forgetting CP/M and focusing exclusively on its native mode capabilities, especially in the area of hi-res graphics on the 80 column side.

>> Von Ertwine was extremely talented and accomplished his task without complaint or excuse.  He had about 4 months to do the whole job and didn't get a real working 80 column chip until he arrived at the January CES.  Von hand-edited the CP/M Z80 instructions using a disk editor, using exactly the same number of bytes on the disk, and hand-calculated the checksums; all of this was in reverse order in the sector, with the sectors in reverse order as seen by the editor.  He did this the night before the show opening, when he was exposed to yet another bug in the 80 column chip.  He didn't bat an eye, he didn't blame the chip, he just made it work.  What we didn't need was to debug all new I/O calls: imagine that everything was for naught because of subtle corruption in disk access, because of having to write and debug whole new sections of timing-dependent code with a myriad of external devices and variables.  (We did our own QA, by the way; the QA department was busy learning BASIC throughout this exercise.)  There was a kernel table of I/O calls meant for exactly that; I support his decision (and may have even taken part in the decision) to use what existed and use the time given to us to do what truly required unique development.  If anyone on the team had spent his time reworking something that already existed, he would probably have been taken off the team as superfluous.

We barely had text working in the chip, there was NO thought of a future graphic capability at that time, we had a graphics chip in the VIC, the 80 column was all about 80 columns.  We knew people could later develop whatever they want for the VDC later.  We only had 5 months from conception to custom chips and PCB, case and software.  We didn't sleep much back then.


I maintain that the C-64 compatibility mode was ill-advised and totally unnecessary.  If the focus on the 128 had been solely on it being a 128 and not all things to all men, the technology would have been far more powerful, developers would have been more inclined to develop for it, more machines would have been sold, and a self-sustaining "economy" would have arisen like that which embraced the C64.

>> Just my way of giving back to the people who took the time to write code in support of CBM.  They didn't have to worry that sales of the new machine would detract from their customer base rather than expand it.  Anything else would have been the detraction.  Making a machine do more does not make it less capable, IMHO.  I maintain that yet another machine that couldn't build upon a software base, hell, one of the world's largest software bases, would have been ill-advised and unnecessary.  There was a ton of software the very first day it hit the streets... pretty important for a product with an expected lifetime of 14 months.

Regarding the occasionally-maligned VDC, it was more than adequate in the context of the 128, and generally reflected the chip-making technology of the time, especially as embodied in the CSG's capabilities.  Actually, the 8568 version found in the 128DCR was more technically capable than the 8563 (more adaptable to various display configurations), and did have an unused interrupt capability, which was tied to the status register's bit 7.  Obviously, someone somewhere at either CSG or Commodore itself thought that an IRQ-driven 80 column display might be something worth exploring, although, as we all know, that too was never developed.  Incidentally, I never saw the lack of sprites as a weakness, since I was not (and still am not) into game playing on computers.  I viewed sprites as part of the "toy personality" of the C-64.

>> The VDC was a piece of shit.  The designer felt that interrupts weren't needed since you could poll it at any time.  I was so used to working with chip designers who understood how their chips were used in real life that I was slow to ask questions I should never have had to ask.  I would have been better off with a 6845.  Blame me for listening and accepting when I should have grilled the designer instead of accepting the word of his well-meaning manager.

It is true that the VDC is/was somewhat obtuse to control.  However, it was no more difficult to work with than other CRT controllers of the era, such as the 6545e, on whose architecture the 8568 controller was based.  The VDC had a huge advantage over the VIC in that it did not need to usurp bus cycles to do its job, which meant that even in slow mode, the 128 would run faster if only the 80 column side was in use.  In fast mode with %00000011 written into the VIC's register $30 (which also kills raster interrupts -- don't try it from the keyboard), the system ran a good 2.2 times faster than it would in 1 MHz mode with video going to the VIC.  So, all-in-all, the VDC was okay -- a little hard to work with until the programmer had a clear understanding of what it was doing, but certainly more capable in most regards than the VIC.

>> Piece of shit.  Layout was poor in the chip, large clock skews, designed to accentuate metastability instead of retard it.  I had to phase-lock it to the VIC chip just to get it working in time for CES, something we did with only 8 days to go when it became apparent that the designers weren't really fixing anything in the last release.  In fact the last release was totally broken when I got it; they had stuck a back-bias generator in there cause, you know, it was more important than making the chip work.  I had to ground the substrate of the chip and put 330 Ohm pullups on the data lines just to make it work for the show.

One place where CBM definitely dropped the ball was in I/O capability.  Instead of kludging the lame serial bus with the "fast" routines, true IEEE-488 capability should have been native, and not the slow IEEE-488 of the old PET/CBM line (which only ran at 1.2 KB/sec, a fraction of what IEEE-488 was capable of).  A real RS-232 port should have been present, the I/O block should have been more finely decoded to allow more devices to share that part of the memory map and a Centronics port should have been available to provide native support to mainstream printers.

>> Now THAT would have increased the cost, been outside of the delivery schedule, and not fit the case :)


Regarding BASIC 7.0, I didn't do much development in it because it was too slow to do what I wanted.  Also, despite the greatly extended language offered in the 128 mode, it still was lacking in a number of hardware support areas.  For example, why weren't TI and TI$ tied to one of the TOD clocks instead of the unreliable IRQ-jiffy "clock?"  Why weren't there any keywords to manipulate the user port lines, instead of having to PEEK and POKE?  Where was the support for direct control of the VDC (rather than having to go through the screen editor ROM)?


>> Written in 5 months and still took the time to support DMA.  Sorry you didn't like peeks and pokes. The OS provided the operating stuff, the apps can always directly manipulate.


All-in-all, the 128 delivered no more or less than promised.  It simply didn't deliver enough to make it a clear choice over anything else available at the time.  Running in 64 mode, it did little more than a real 64, but at a much higher cost.  Running as a 128, it didn't have enough processing speed or capacity to make it a viable choice against the PC architecture of the time, nor did it have anything to offer against the Amiga (which was technically superior to everything else at the time).  In CP/M...well as a CP/M machine it made a pretty good door stop.  :-)

>> 5 pounds of shit in a 9 pound bag.  Basically sweeping all of the leftovers into a doggy bag as the last 8 bit project.  People who already had a C64 bought a C128, people who wanted 80 columns might have tried one also, one step closer to a real word processor... which by the way we had day one from CP/M.  Whether any of it got used, or how it got used, wasn't a variable we were trying to solve for; we provided the capabilities to the best of our abilities, and let the users decide what worked for them and what didn't.  If asked, I would have predicted that I would be at least 45% wrong in guessing what users ultimately would use, and since I didn't know which 45% that was, we put it all in.

Still, made $200 million revenue in the first year or so, so not a bad failure.


Bil Herd


BilHerd

Quote from: smf on December 29, 2007, 07:41 PM
Quote from: admin
So why did Commodore drop the ball so badly with this machine?
The 128 was too expensive. After jack left, commodore didn't know what to do anymore.

If jack had stayed then the 128 would have been cheaper, the z80 & 80 column would have not existed.

The real mistake that commodore made was allowing the c64 and a500 to be so successful because they didn't manage to replace them. This left them very vulnerable.

Lol... this reminds me of a game we used to play.  There was a whole layer of managers who presumed to know what Jack wanted; they would walk around quoting their version of the gospel, and I would bet that the majority had never even talked to Mr. Tramiel.  We used to plant false Jackisms, as obtuse as saying that perhaps Jack likes Petunias.  Sure enough, at some point a manager would want us to account for Petunias or some other silly thing.

In contrast, there were people like Bob Russell.  Bob had survived an encounter with Jack where he was wrong to listen to a manager (told him to counter-ship the Vic20 demo unit instead of hand-carrying it... bottom line is the meeting with the investors was sans the Vic20).  Bob could truly walk through the gateway into International and speak directly with Jack and even challenge some of the rhetoric of the day.  Needless to say, the Petunia-spouting managers were envious if not jealous and threatened.

As far as what Jack would have done to the C128, I would have loved his input throughout the project.  As it was, I had negotiated one constraint early on, the cost to produce, and I met my commitment.  From my previous experience with Jack, I would venture that he would have challenged me to make it cheaper anyway, but not necessarily at the expense of marketshare opportunities.  With that said, I defy anybody other than those closest to him to truly "know" what Jack would do; he could surprise people.

Ever wonder about the eclectic mix of TTL parts and no PALs?  I had pulled the stock report for overseas and designed the overstocks into the C128 as an inventory burner.  At the time we had over 2 million 7406's in stock.  The design was full of "free" parts for the first year, minus the cost to insert them and the failure rates of an increased parts count.  The Z80 reduced the price, not increased it.  As it was, we didn't tell management about anything we were truly doing until it was done.  The rule was whatever worked when we got to the CES show became the frozen design except for the most minor of changes... we literally started the FCC process when we got back, and anything beyond a class 1 change endangered shipping to K-Mart in June.

This rule did cost me linear expansion to 512K/1MB, and I didn't like it but couldn't disagree.  The PCB was laid out for it and I hid the upgrade in a rev of the MMU, but they found out about the added capability and canceled the chip rev.

I am sure there were those who thought that it could have used more Petunias in the final design tho... lol

Bil Herd

StyleCHM

Bil, just a quick question - was there ever any thought given to producing a 2Mhz VIC and using 4Mhz RAM? Something like a doublespeed c64? :)




BilHerd

Quote from: WonderSlug on January 04, 2008, 03:33 PM
Another thing about the C128 is that one of its greatest strengths, the Enhanced Basic 7.0 language built into the machine, with the dozens (or even hundreds) of extra commands to do nearly everything one could hope for, ended up little used by C128 owners.

It was probably the best and most powerful version of BASIC out there, in the mid-1980s, and yet, very few knew much about it.

Of all the C128 programs out there, how many of them used this new BASIC language?  Not as many as they should.

Terry Ryan, with assistance from Fred Bowen, did these advanced commands pretty much on their own, unless you count the fact that they were threatened with being fired if they continued to introduce structured commands.  In the end the manager admitted that it perhaps had been the right thing to do, but nonetheless they had ignored him, and so they got very poor performance reviews.  Basic 7.0 was done because it could be done, it felt right, and we weren't seeking permission from add-on managers that attached themselves to the project after they saw it was going somewhere.

All in all, a good time was had by all.

Bil

Andrew Sutton

After reading the "Commodore Book" you can get a good idea why the C128 didn't seem as successful as the C64. I give Bil a ton of credit for the work he performed on the C128. I can't get over how much Bil accomplished at Commodore being a high school dropout! As impressed as I was, I acquired a C128 to see what I missed by not "upgrading" from my C64 back in the day... quite a bit!!
When I had a C64 back in '85, I never considered a C128, let alone an Amiga. The C128 seemed more business oriented, as well as the Amiga. Commodore did a lousy job of marketing either computer. In my opinion, the C128 and Amiga were fighting for the same turf, and the C128 was outgunned. Don't get me wrong, I like the C128, but the "stepchild" Amiga seemed more advanced.
"We made machines for the masses, they made machines for the classes," Jack Tramiel

            telnet://commodorereloaded.servebbs.com

BilHerd

Quote from: StyleCHM on January 28, 2008, 11:27 AM
Bil, just a quick question - was there ever any thought given to producing a 2Mhz VIC and using 4Mhz RAM? Something like a doublespeed c64? :)


None whatsoever. :) 

We barely got the system working with the technologies of the day, in all honesty.  The C64 had a large failure rate because it had several timing violations, mostly due to a lack of edges from the master clock with which to clock events. The asymmetry of the clock when using both rising and falling edges was completely beyond our control; we simply didn't have affordable clocks at higher frequencies.

The 64 and the 128 already run at 2 MHz, as both the VIC chip and the 6502 get only half a cycle.  150 ns DRAMs were fairly new; in fact they caused their own failure modes in the C64, as the data would go away quicker at the end of the cycle and blow the data hold time.

So here is how we spent the timing budget of the day: first we had to tristate the VIC or the 6502.  This takes a LONG time in the scheme of things; NMOS needs (relatively) huge drivers, which have large capacitances on their internal gates, causing delays as the charge bleeds off to let the driver turn off.  At the same time we had to make sure the device coming on didn't come on too early and smash the bus, so right off the bat we had used 50-80 ns just to get into the cycle.

From there we had to wait for the next edge that gave us RAS for the DRAMs; RAS too early, even by a bit, was a disaster, and too late was a waste.  Then we had to multiplex the address using the most convenient edge that didn't blow RAH (row address hold).  We were also doing some switching of things in NMOS, which was SLOW; any decision in the MMU or PAL took 50-90 ns on top of whatever edge, if memory serves.  Then the same with CAS, and then the soak time for data coming out and going in.  Then, hopefully, a graceful end to the cycle, where we left the data hanging out long enough to satisfy everybody's hold time.

We were 74LS based too; the F series was expensive, radiated off the band for FCC, and created video noise, so we lived with 10-20 ns switching times.

To do this we were using the fastest consumer-priced DRAMs of the day, 150 ns.  I didn't see a 120 ns part until much later, and to go one integer faster we would have needed a 28 MHz clock and something like a 70 or 80 ns DRAM, which didn't exist.  We would have had to go to static RAMs to get a second cycle in, and the cost and SIZE were huge; all statics were pretty much in the doublewide packages.

So really we were at the limit of the family, the technology, and the opportunity of the day. On top of all that, chips like the 8563 didn't live up to the timing "family", i.e., they didn't provide the right times for their speed grade, and THEY were at their limits also.

So in short, I was already using the fastest 6502/6510 of the day, which was the 2 MHz version, for its half of the 1 MHz cycle.  Going faster meant taking both cycles.
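The budget Bil walks through can be tallied roughly. This is a toy back-of-the-envelope sketch, not a datasheet calculation: the line items and nanosecond figures below are pulled from the worst cases mentioned in the post, and the ~500 ns phase assumes each of the VIC and CPU gets half of a ~1 MHz cycle.

```python
# Rough tally of the half-cycle timing budget described above.
# All figures are illustrative worst cases taken from the post,
# not from any datasheet.
PHASE_NS = 500  # each of VIC and CPU gets half of a ~1 MHz cycle

budget = {
    "tristate/turn-on guard": 80,   # NMOS drivers releasing the bus
    "MMU/PAL decision":       90,   # slow NMOS logic switching
    "74LS glue (2 stages)":   40,   # 10-20 ns per gate
    "DRAM access (150 ns)":  150,   # RAS-to-data for a 150 ns part
    "data hold at cycle end": 50,   # soak time so everyone latches
}

used = sum(budget.values())
print(f"used {used} ns of {PHASE_NS} ns -> {PHASE_NS - used} ns slack")
```

Even with generous rounding, most of the phase is spoken for, which is why a faster grade of DRAM (and a faster master clock) would have been needed to squeeze in a second cycle.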

Back then I don't think you would have sold more units by being x percent faster; comparative speed wasn't a deciding attribute between the competing computers, not when one had sprites and one didn't.  These days, when the architecture is essentially the same, performance becomes a deciding factor.

As an example: did anyone who ever wanted a C64 decide to get a TI instead because the serial bus was slow?

Hope this helps explain the lay of the land for the day.

Bil

Andrew Sutton

Good point. The C64 was "superior" to the TI99/4a, and the price was right too!!! ;D
"We made machines for the masses, they made machines for the classes," Jack Tramiel

            telnet://commodorereloaded.servebbs.com

airship

I just love when Bil posts. I learn so much!

Just for the record, I would have LOVED for INFO magazine to have been able to stay 8-bit and focus on the C128. I loved that little machine. I never felt the same about the Amiga. I realized the business reasons we had to go all-Amiga, but the C128 was (and is) the funnest computer ever built. IMHO.
Serving up content-free posts on the Interwebs since 1983.
History of INFO Magazine

nikoniko

#41
Quote from: BilHerd on January 27, 2008, 08:34 PM
I would have been better off with 6845.

Sure sounds like it. Perhaps something like what Amstrad did for the CPC with its 6845 and simple pixel generator would have been less of a headache?

BilHerd

Quote from: Michael Hart on January 28, 2008, 01:37 PM
Quote from: BilHerd on January 27, 2008, 08:34 PM
I would have been better off with 6845.

Sure sounds like it. Perhaps something like what Amstrad did for the CPC with its 6845 and simple pixel generator would have been less of a headache?

The book was open on my desk to the timing specs for the 6845 when the manager of the chip group, Bob Olah, stopped by, coffee cup in hand, and said "I heard you're looking for an 80 column chip".  For the record, I was going to do color planes, so it wouldn't have been monochrome. :)

I don't regret having made the decision (we made many every day), but I could have done a better job vetting the chip.

I remember when I heard the text block move was only 256 characters (2 1/2 lines).  I looked at the chip guy: "huh?"  "Why would you need more?" he responded.  My response was to call a crowd around for the explanation, as these were generally entertaining and enlightening.  He never got the difference between starting a block move and going on with things (linear scope) versus having to wait and count while the entire page scrolled, 2.5 lines at a time, before you could finally move on (stateful in place).  Same with interrupts: why would you go on to do something else when you could sit in a tight loop polling a bit?

To make that point we used to pick up the phone over and over rather than wait for it to ring; we figured it demonstrated the efficiency of polling to see if someone was on the phone rather than waiting for the annoying ring/interrupt.  The bartenders at the local bar Margarita's wondered why we were always abusing their phone.  (Because it kept us from abusing a chip designer.)

BigDumbDinosaur

Quote from: BilHerd on January 30, 2008, 06:12 AM
I remember when I heard the text block move was only 256 characters (2 1/2 lines).  I looked at the chip guy: "huh?"  "Why would you need more?" he responded.  My response was to call a crowd around for the explanation, as these were generally entertaining and enlightening.  He never got the difference between starting a block move and going on with things (linear scope) versus having to wait and count while the entire page scrolled, 2.5 lines at a time, before you could finally move on (stateful in place).  Same with interrupts: why would you go on to do something else when you could sit in a tight loop polling a bit?

Repeat after me: chip designers aren't programmers.  <Smile>

The 256-byte maximum block move/copy never made sense to me either.  After all, a scroll is a line-oriented thing, so why not use a 16-bit value to define how much to move/copy?  It would have been a cinch to scroll, say, 10 lines with a 16-bit byte count ($0320).  If you're doing a bitmap, you could move/copy the entire visible screen with a single command!  Dumb, dumb, DUMB!  The 8563 might not have been a total piece of shit, but it was dangerously close to falling into the stuff.

Same with no IRQ.  Dumb!  If the 8563 had been endowed with full IRQ capabilities (register read, vertical retrace active, light pen active, etc.) the 128 screen kernel could have been a lot more efficient.  Tell the VDC which register you want to twiddle and set up a flag somewhere to indicate if the operation is a read or write.  If a write, deposit the byte in a mailbox for later.  Go on about your business.  If a read, just go on about your business.  When the IRQ comes from the VDC, see what the operation is supposed to be, read from or write to the VDC.  Go on about your business.  Sounds like a typical interrupt-driven I/O scenario, eh?
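The mailbox scheme described above can be modeled in a few lines. This is a toy Python simulation under loud assumptions: the real 8563 has no IRQ line, the register number is picked arbitrarily for illustration, and the "interrupt" is just a loop draining the queue. The point is only the shape of the protocol: requests are deposited and the handler completes them later, so the caller never busy-waits on the chip.

```python
from collections import deque

class FakeVDC:
    """Toy stand-in for an 8563 that (hypothetically) signals a
    'ready' interrupt.  Register behavior is invented for illustration."""
    def __init__(self):
        self.regs = [0] * 37  # the 8563 exposes 37 registers

pending = deque()   # mailbox: (reg, value) for writes, (reg, None) for reads
results = []        # completed reads land here

def request_write(reg, value):
    pending.append((reg, value))   # deposit in the mailbox, go on

def request_read(reg):
    pending.append((reg, None))    # note the read, go on

def vdc_irq_handler(vdc):
    """Runs when the (hypothetical) VDC 'ready' interrupt fires:
    complete one pending register access."""
    if not pending:
        return
    reg, value = pending.popleft()
    if value is None:
        results.append(vdc.regs[reg])  # read completes in the handler
    else:
        vdc.regs[reg] = value          # write completes in the handler

vdc = FakeVDC()
request_write(12, 0x20)   # e.g. a display-address byte (illustrative)
request_read(12)
while pending:            # stand-in for the IRQ firing per pending op
    vdc_irq_handler(vdc)
print(results)            # the read observes the earlier queued write
```

Ordering falls out for free because the mailbox is a FIFO: the read issued after the write is completed after it, which is exactly the "go on about your business" behavior the post wishes the 8563 had supported.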

Oh well!  Guess we can't redesign the machine, can we?
x86?  We ain't got no x86.  We don't need no stinking x86!

nikoniko

Quote from: BigDumbDinosaur on January 31, 2008, 11:50 AM
The 256 maximum byte block move/copy never made sense to me either. 

The C128 team must have been even more delighted when they found out that not only was there a minuscule limit, but sometimes the 8563 would fail to copy the last byte. :)

BigDumbDinosaur

Quote from: Michael Hart on January 31, 2008, 12:54 PM
Quote from: BigDumbDinosaur on January 31, 2008, 11:50 AM
The 256 maximum byte block move/copy never made sense to me either.

The C128 team must have been even more delighted when they found out that not only was there a minuscule limit, but sometimes the 8563 would fail to copy the last byte. :)

I'm sure Fred Bowen and his associates were ready to slap CSG upside the head over it.  I recall tripping over that little last-byte problem several times before I found out it was an "official bug."  The workaround (it seems almost everything associated with programming a Commodore 8-bit computer is a workaround) was to not try to copy a whole page at a time.  As I said: dumb, dumb, DUMB!
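The shape of that workaround can be sketched as a toy simulation. The failure model here is deliberately simplified (the post says the real bug was intermittent; this model drops the last byte on every full-page copy), and both function names are invented, but the fix matches what's described: never hand the chip a whole page at once.

```python
def buggy_block_copy(mem, src, dst, count):
    """Toy model of the 8563 block copy: drops the final byte when
    asked to move a full 256-byte page.  (Simplified; the real bug
    was intermittent.)"""
    effective = count - 1 if count == 256 else count
    for i in range(effective):
        mem[dst + i] = mem[src + i]

def safe_block_copy(mem, src, dst, count):
    """Workaround: split the transfer so no single request reaches
    the buggy full-page size."""
    while count > 0:
        chunk = min(count, 255)        # stay under the 256-byte case
        buggy_block_copy(mem, src, dst, chunk)
        src += chunk
        dst += chunk
        count -= chunk

# Demonstrate: copy one full page through the buggy 'chip'.
mem = list(range(256)) + [0] * 256
safe_block_copy(mem, 0, 256, 256)
print(mem[256:512] == list(range(256)))   # every byte arrives intact
```

A 256-byte request becomes a 255-byte copy plus a 1-byte copy, so the buggy full-page path is never exercised, at the cost of one extra register setup per page.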
x86?  We ain't got no x86.  We don't need no stinking x86!

LuxOFlux

#46
The step up from the C64 seemed too small for me, being a programming and gaming type.

Besides, Amigas and Atari STs and PCs were on the horizon.

If it had been 16 bit with extra sound and expanded resolution I would have gone for it though.

Lucas

RobertB

Quote from: LuxOFlux on March 08, 2008, 06:08 AM
The step up from the C64 seemed too small for me, being a programming and gaming type.
The C128 was the perfect step up for me back in 1985.
Quote from: LuxOFlux on March 08, 2008, 06:08 AM
Besides, Amigas and Atari STs and PCs were on the horizon.
And it was an expensive horizon back then.  Far cheaper to get a $200 new C128 and a $200 new 1571.

            Truly,
            Robert Bernardo
            Fresno Commodore User Group
            http://videocam.net.au/fcug
            The Other Group of Amigoids
            http://www.calweb.com/~rabel1/

Andrew Wiskow

Quote from: RobertB on March 09, 2008, 06:09 PMThe C128 was the perfect step up for me back in 1985.

The C128 was the perfect step up for me back in 2007!  ;)

-Andrew
Cottonwood BBS & Cottonwood II
http://cottonwood.servebbs.com

BigDumbDinosaur

I bought a C-128 two days after it went on sale in 1985.  To me, it was a substantial step up from the C-64, mostly because of the 80 column display and expanded keyboard (I'm not into BASIC programming).  In mid-1987, I acquired a Lt. Kernal subsystem and shortly thereafter, bought two C-128DCRs on which to do software development.  All three machines were multiplexed to the Lt. Kernal subsystem, and I ran with this mess until mid-1994.
x86?  We ain't got no x86.  We don't need no stinking x86!