DigiKey-eMag-Smart Manufacturing-Vol 17

In 1964 IBM released a new flagship computer product line, System/360. Some historians mark this as the formal beginning of the ‘Computer Age.’ These computers cost over a million dollars, so most of them were leased through service providers. A lease could run over $80,000 a month, with the higher-end models costing up to $115,000 a month. Often these machines were leased by time-sharing companies, which sold mainframe time to their clients, though larger companies like General Electric ran their own. The problem was that the manufacturing industry had no practical way to take advantage of them, at any price. How does having a five-ton machine in a hot room help a company like General Motors make more automatic transmissions? The economic growth of the late 1960s was ending, and manufacturers knew it. The industry was reaching the limits of what the technology of the day was capable of.

The world of computers up to 1968

The first programmable controller was not considered a computer at the time. Lore has it that, during the design process, any notes that had the word ‘computer’ on them were balled up and thrown away on sight. At the time, ‘computer’ was a four-letter word: a machine that took an entire wing of a building to maintain and carried a monthly air-conditioning bill that would make any bean counter cry. This was still years before the release of the Intel 4004 or the MOS 6502. The first general-purpose electronic computer was announced in 1946. Weighing thirty tons, the ENIAC (Electronic Numerical Integrator and Computer), also known as the ‘Mathematical Robot,’ was designed for the US Army to assist with advanced calculations such as artillery trajectories and weather forecasting. World War II had effectively ended before ENIAC could be publicly announced. The machine was alive with the clicking of relays and the heat of filament vacuum tubes.

“Although designed for war, the machine holds possibilities of better living in peace.”

Soon after, companies were quick to try to make a mass-producible computer, but they were far more limited by the day’s technology than they realized. The following year, 1947, three engineers at Bell Labs developed the first transistor, which allowed current to be switched on and off like a vacuum tube, but as a solid-state device. The collision of those two inventions drove the great change of the twentieth century. It took a while to work out, but Bell Labs, in conjunction with the US Air Force, developed the first transistorized computer, TRADIC (TRAnsistorized DIgital Computer), in 1954. TRADIC used several hundred ‘point-contact’ transistors and over ten thousand germanium diodes. While TRADIC was a milestone, it was used primarily for R&D and never meant for commercial use. By the end of the 1950s, it was clear to the industry that computers were the next big thing, but even into the 1960s machines still ran on vacuum tubes, filament bulbs, and relays.

Challenges of Pre-PLC automation

Before the PLC reached the market in 1969, engineers and manufacturers faced a range of challenges when trying to automate machinery and processes.
