Talk:History of computing hardware
This is the talk page for discussing improvements to the History of computing hardware article. This is not a forum for general discussion of the article's subject.
Archives: 1, 2 · Auto-archiving period: 12 months
History of computing hardware is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
This article appeared on Wikipedia's Main Page as Today's featured article on June 23, 2004.
Current status: Former featured article
This level-4 vital article is rated B-class on Wikipedia's content assessment scale. It is of interest to several WikiProjects.
Old discussions
Old discussions have been moved to archives - use the navigation box to switch between them. I used the much nicer {{archives}} and {{archivesnav}} templates as found on the Personal computer talk pages to spruce up navigation a little. Remember when creating new archive pages that they must have a space in the title - talk:History of computing hardware/Archive 3 would be the next page, for example. --Wtshymanski (talk) 01:35, 25 September 2009 (UTC)
Commodore
Call me a massive geek, but surely the C64 and Amiga deserve some mention in here. The advancement in personal computers isn't just down to the number of transistors - those computers added some really creative features (particularly with sound hardware) which we now take for granted. Their rise and fall (there's a certain book by a congruent title) is a huge chapter in the history of computing, surely...
Copyedits needed for 2nd generation section
In the interests of keeping this article a featured article, might we move the latest contribution on 2nd generation computers to the talk page and work on the English prose before reinstating it to the article page? --Ancheta Wis (talk) 14:17, 2 January 2008 (UTC)
- Hi, that's my work! What is wrong with it? 92.1.67.188 (talk) 14:35, 2 January 2008 (UTC)
- Hi, I replied on your talk page to redirect here.
- The expert you allude to in your first paragraph was Thomas J. Watson
- The Von Neumann pre-print of 1947 went all over the world. That is how Israel built its first computer, for example. Russia did the same. What the 1954 date on Italy's first computer shows is that they built either on Von Neumann's architecture or they studied other documents. In any case, a citation would be good.
- Did the IBM 1401 use only transistors?
- Does tenths of thousands mean 10000 or 100?
- In any case, I think you see what I mean. The English needs copyediting. I am not referring to your content, with the exception that we need citations. --Ancheta Wis (talk) 15:19, 2 January 2008 (UTC)
- I have commented out the contribution, as the English needs copyediting. The sentences are disconnected, the timeline of development for the second generation is non-sequential, and there is no flow from one statement to the next. --Ancheta Wis (talk) 10:47, 17 February 2008 (UTC)
computers are an amazing creation of sensation. —Preceding unsigned comment added by Sckater (talk • contribs) 21:50, 8 March 2008 (UTC)
question
I was wondering whether the Antikythera mechanism is the first computer, because there are a lot of articles that make that claim. Tomasz Prochownik (talk) 21:05, 23 April 2008 (UTC)
- Follow the link for the latest thinking on the matter. --Ancheta Wis (talk) 23:18, 23 April 2008 (UTC)
Citations needed
Fellow editors, User:Ragesoss has noted that we are building back up the citations for this FA. When this article was first formed, the rise in standards for Featured Articles had not yet occurred. Since I have been volunteered for this, there will be an American bias to the footnotes I am contributing; please feel free to contribute your own sources.
Please feel free to step up and add more citations in the form of the following markup: <ref>Your citation here</ref>. You can add this markup anywhere[1] in the article, and our wiki software will push it to the <references/> position in the article page, individually numbered and highlighted when you click on the ^. As an illustration, I placed this markup on the talk page so that new users can even practice on this talk page.
In my opinion, the best source is Bell and Newell (1971)[2], which is already listed in the article. I do not have time to visit the local university library, so my own contributions are from sources which I have on my own bookshelves; this may be appropriate since the seminal period 1945-1950 will probably be viewed as the heyday of the first generation of electronic digital computers, which blossomed in the US.[3],[4],[5],[6],[7],[8],[9] I recognize that there will need to be more citations from the Association for Computing Machinery and the IEEE Transactions, but that will have to come from those editors who are in the Wikiproject on computing. In particular, the Radiation Laboratory of MIT published a series of books, The M.I.T. Radiation Laboratory Series,[10] which are the foundation for computing hardware, in tandem with the Manhattan Project; what is common to these projects is that they involved groups of cooperating contributors.[11] Before the howls of outrage subside, please note that the exact forms of computer hardware had not yet been selected in this period, but since the technologists were already in place for other purposes, it was a small step to the forms of hardware we see today.[12],[13],[14],[15],[16],[17],[18] The forms of hardware could easily have gone in other directions, and our current computers could then have turned out quite differently.[19][20]
New users (especially those with a CS or EE background), please feel free to contribute your citations. Wikipedia:Five Pillars summarize the guidelines for editors, and your cheatsheet for markup can be found here. Users can append comments to the foot of this talk page, signed with the signature markup: --~~~~
Casual readers might note that the references which will be added to this article can be purchased quite cheaply on the Internet (typically for a few dollars), which in sum would amount to a nice education in this subject. --Ancheta Wis (talk) 09:31, 3 May 2008 (UTC)
We are up to 59 footnotes. You can examine the edit history to see how the citations were embedded in the article, as well as study this section, for examples on how to do it. --Ancheta Wis (talk) 10:01, 6 May 2008 (UTC)
User:SandyGeorgia has noted that the citations are expected to have a certain format. Everyone is welcome to improve the citations. --Ancheta Wis (talk) 01:42, 7 May 2008 (UTC)
It appears that the footnote macro is space-sensitive. For example <ref name=IBM_SMS/ > works, but <ref name=IBM_SMS/> causes error messages unless a space is added after the trailing slash. To see this, look at this diff --Ancheta Wis (talk) 09:42, 9 May 2008 (UTC)
Sample citation format from User:Wackymacs:[21]
- This one was formatted incorrectly. There should be a "|" in between the url and the accessdate like this:[22]
References sample illustration
- ^ Your citation here
- ^ Gordon Bell and Allen Newell (1971) Computer Structures: readings and examples ISBN 0-07-004357-4
- ^ John von Neumann's 1945 First Draft of a Report on the EDVAC, which Herman Goldstine mimeographed and distributed worldwide, had a global effect, producing von Neumann-architecture computer systems around the world. For example, the first computer in Israel was built this way.
- ^ Federal Telephone and Radio Corporation (1943, 1946, 1949), Reference Data for Radio Engineers
- ^ The Jargon File, version 4.4.7
- ^ Charles Belove, ed. (1986) Handbook of modern electronics and electrical engineering, ISBN 0-471-09754-3
- ^ Sybil P. Parker, ed. (1984) McGraw-Hill encyclopedia of electronics and computers ISBN 0-07-045487-6
- ^ Arthur B. Glaser and Gerald E. Subak-Sharpe (1977), Integrated Circuit Engineering ISBN 0-201-07427-3
- ^ Richard H. Eckhouse, Jr. and L. Robert Morris (1979), Minicomputer Systems: organization, programming, and applications (PDP-11) ISBN 0-13-583914-9
- ^ For example, John F. Blackburn (1947), Components Handbook, Volume 17, M.I.T. Radiation Laboratory Series, Lexington, MA: Boston Technical Publishers
- ^ "I must say that I did not design Windows NT -- I was merely one of the contributors to the design of the system. As you read this book, you will be introduced to some, but not all, of the other contributors. This has been a team effort and has involved several hundred person-years of effort." -- Dave Cutler, Director, Windows NT Development, in the foreword to Inside Windows NT, ISBN 1-55615-481-X, by Helen Custer, p. xix.
- ^ Ron White (1995), How Computers Work ISBN 1-56276-344-X
- ^ Scott Mueller (2002), Upgrading and repairing PCs ISBN 0-7897-2683-1 CHECK_THIS_ISBN
- ^ Harry Newton (1998), Newton's Telecom Dictionary ISBN 1-57820-023-7
- ^ George McDaniel, ed. (1993), IBM Dictionary of Computing ISBN 0-07-031489-6
- ^ Paul Horowitz & Winfield Hill (1989), The Art of Electronics ISBN 0-521-37095-7
- ^ David A. Patterson and John L. Hennessy (1998), Computer Organization and Design ISBN 1-55860-428-6
- ^ Alan V. Oppenheim and Ronald W. Shafer (1975), Digital Signal Processing ISBN 0-13-214635-5
- ^ W.J. Eckert (1940), Punched card methods in scientific computation, Lancaster, PA: Lancaster Press
- ^ Robert Noyce's Unitary circuit, US patent 2981877, "Semiconductor device-and-lead structure", issued 1961-04-25, assigned to Fairchild Semiconductor Corporation
- ^ Jones, Douglas W. accessdate=2008-05-15 "Punched Cards: A brief illustrated technical history". The University of Iowa. {{cite web}}: Check |url= value (help); Missing pipe in: |url= (help)
- ^ Jones, Douglas W. "Punched Cards: A brief illustrated technical history". The University of Iowa. Retrieved 2008-05-15.
Zuse and Von Neumann
According to Hennessy and Patterson, von Neumann knew about the details of Zuse's floating-point proposal. This suggests that the sentence 'Zuse was largely ignored' should be stricken. Any objections? --Ancheta Wis (talk) 10:30, 5 May 2008 (UTC)
Zuse did not implement the floating-point design he patented in 1939, before WWII ended. Von Neumann was aware of Zuse's patent and refused to include it in his Princeton machine, as documented in the seminal paper (Burks, Goldstine and von Neumann, 1946). -- Hennessy and Patterson p. 313, note "A decimal floating point unit was available for the IBM 650, and [binary floating-point hardware was available for] 704, 709, 7090, 7094, ... ". "As a result, everybody had floating point, but every implementation was different."
To this day, floating point operations are less convenient, less reliable, and more difficult to implement (in both hardware and software). -Ancheta Wis (talk) 08:07, 10 May 2008 (UTC)
'First electronic computer'?
This assertion is made about the Colossus in this article. It is also made about the ACE in that article. THERE CAN BE ONLY ONE! Twang (talk) 18:59, 10 May 2008 (UTC)
- On the other hand, the article also states "Defining a single point in the series as the "first computer" misses many subtleties." thank you for BEING BOLD! You are welcome to contribute to the article and the talk page! --Ancheta Wis (talk) 20:34, 10 May 2008 (UTC)
- Not to be too pedantic, but the article is an example of how a recurring need (in this case, the need to calculate) gets met multiple ways, at multiple times, by multiple people trying to solve a problem. For example, Pascal was trying to help his dad collect taxes; ENIAC was used to fight a war by calculating the trajectories of artillery shells; Zuse was trying to ease the burden of his engineering work; Colossus was trying to decode secret messages; IBM was trying to extend the use of its punch card machines for business purposes; Maurice Wilkes was excited about the possibilities of the First Draft of the Design for EDVAC. You get the idea: it's asking 'What does the first mean?'. As we now know from spacetime, time depends on the observer - what does first mean in that case? It only has meaning in the context of a thread. Thus clearly, Maurice Wilkes came after ENIAC, but before the implementation of EDVAC. Colossus was secret, so it was part of a different thread, by definition. And in the article, there is evidence that von Neumann knew something of the ideas of Zuse, so the design and architecture of EDVAC is after Zuse. However, you cannot say that the implemented EDVAC is after Wilkes' machine implementation - they are parallel threads which branched after Wilkes was influenced by the First Draft. These ideas are part of Lee Smolin's book Three roads to quantum gravity ISBN 0-465-07835-4 pp.53-65. (As you can see, classical logic needs to be reformulated. The world is not monotonic.)
I don't have Smolin's book in front of me so I can't give you a page number right now. And I can't put what I just wrote in the article because I don't have a citation other than Smolin, which isn't explicitly about computing hardware (it's about physical processes in general). --Ancheta Wis (talk) 21:07, 10 May 2008 (UTC)
- Just following up about ACE, the Automatic Computing Engine. It's the same idea. Turing owed nothing to EDVAC. So there are other editors who have the same kind of reasoning as Smolin's work, stated above. However, just Turing's knowledge that EDVAC is possible said a lot to him -- the ACE solution also has to obey the laws of physics, like EDVAC; thus the ACE problem solvers had a lot less work to do when solving their specific issues on the way to a goal.
- These kinds of problems, about priority and independence, are being solved with clean rooms, where developers work in isolation from other implementers. This is all faintly antique for anyone in the open source movement; all that has to be done in open source is to include the provenance of the code base, to keep it Open.
- That's where Wikipedia can make its mark on world culture: we can keep everyone honest about who owes what to whom, by citing our sources. This article clearly states that von Neumann owed much of his First Draft to Zuse, Eckert/Mauchly (who owe something to Atanasoff/Berry) and the rest of the inventors who came before him. And Wilkes (and the rest of the world) owe much to von Neumann, etc. Since Turing's ACE does not have priority over Wilkes' machines, the ACE article should probably heavily qualify the meaning of first in its text. That brings us to Emil Post, the American logician who is independent of Turing, but who waited too long to publish. (He had his ideas 15 years before Turing's 1936 publication...) --Ancheta Wis (talk) 21:39, 10 May 2008 (UTC)
Contributions welcomed.
Fellow editors, you are welcome to make your contribution to this article. See the sections above for examples on adding citations. Be Bold.
--Ancheta Wis (talk) 10:43, 11 May 2008 (UTC)
ENIAC 1,000 times faster than its contemporaries
The article currently states "(Electronic Numerical Integrator and Computer) .... it was 1,000 times faster than its contemporaries." As it is stated that ENIAC was Turing complete, if it had been programmed to break "Tunny" would it have been 1,000 times faster than Colossus? If not then this sentence needs changing. --PBS (talk) 10:08, 13 May 2008 (UTC)
- If we are comparing electromechanical relays to vacuum tubes then the statement is correct. But Tunny came after ENIAC, so it is a descendant, and not a contemporary, which would have been Z1 (the only unclassified project).
- You might change the article page, for example, replacing contemporaries with Z1 in the statement. Citations are welcomed. This page needs more contributors! --Ancheta Wis (talk) 03:35, 15 May 2008 (UTC)
- The sentence has been changed. --Ancheta Wis (talk) 08:41, 19 May 2008 (UTC)
The number of pictures
Ancheta Wis, you're doing amazing work here - but don't you think the article should have fewer pictures? — Wackymacs (talk ~ edits) 06:23, 15 May 2008 (UTC)
- Thank you for your kind words. I propose to comment out Herman Hollerith, the Jacquard loom, the Manchester Baby, and others.
- Editors, you are welcome to contribute to this article and talk page. Be Bold. Citations wanted.
- --Ancheta Wis (talk) 10:06, 15 May 2008 (UTC)
- Good work. Still too many. Some images obscure section headings (in other words, push them out of order). Also, per WP:MOS, images should not be placed directly under a section heading on the left side. — Wackymacs (talk ~ edits) 10:10, 15 May 2008 (UTC)
Citations
It is no good adding lots of citations, when half of them are not formatted properly with the citation templates provided. Please see Wikipedia:Citation templates. All web citations should use the Cite web template, and must have an access date. Also, a lot of the current citations look questionable, and some are useless. (For example, the two citations in the lead explaining hardware and software) - Why? Wikipedia has articles on both of these. — Wackymacs (talk ~ edits) 10:45, 15 May 2008 (UTC)
- So the next step is to revisit the citations, using the sample you have provided, and reformat them. As part of the history of this article, when we did this, the footnote software had not yet reached its current state. I hope it is stable enough to rely on for the future. I have no objection to going back and revisiting the footnotes, as I am a believer in the spiral development process. --Ancheta Wis (talk) 08:06, 16 May 2008 (UTC)
- The "Example 2 article text" appears to be a codification of the usage of ordinary wiki markup practices over the years. I propose reformatting the existing citations into that style. I must say that it appears to place human editors into the position of data entry clerks for the care and feeding of the citation monster. After reading Wikipedia:Citation templates, my reaction is that this article/policy? will evolve.
- My personal preference is for "Example 2 article text", and my guess is that any of the items in Wikipedia:Citation templates is acceptable to the FA reviewers. True statement? --Ancheta Wis (talk) 08:29, 16 May 2008 (UTC)
- You can either use {{Citation}} for everything, or a mixture of {{cite news}}, {{cite web}}, {{cite book}}, and so on. Both methods are acceptable at FA. — Wackymacs (talk ~ edits) 08:54, 16 May 2008 (UTC)
- My last re-format using the cite template ate the name of a co-author. I have to go now, and will return to this issue later. --Ancheta Wis (talk) 16:53, 17 May 2008 (UTC)
- This diff shows 27119 net bytes (a 33% increase) have been added to the article since 29 April 2008. I have attempted to address the concerns of Wackymacs (1c) and SandyGeorgia (1a) in the meantime. --Ancheta Wis (talk) 10:50, 19 May 2008 (UTC)
- All book footnotes should have specific page numbers. Ancheta Wis, can you start adding page numbers (assuming you have the books which are referenced in footnotes)? — Wackymacs (talk ~ edits) 16:50, 5 June 2008 (UTC)
- My books are largely in my basement with the exception of the 40-lb. box I dragged upstairs for the article. But some of the books I have not looked at since I left the semiconductor industry some decades ago, which does not mean I do not remember where I learned the fact, and which book title I have already cited. I am thinking of Mead and Conway, to be specific. To avoid time pressure, because I cannot predict where (in what box, as is probably apparent, I own thousands of books, not to mention 3 editions of Britannica) I will unearth the book, I will simply comment out those book refs which lack the page numbers. I will also try to conserve on bytes in the references for the sake of the page limit. --Ancheta Wis (talk) 00:12, 6 June 2008 (UTC)
Replaced the {{cite}} with {{Citation}}. Retained {{Ref patent}} on the recommendation of the Citations people. The notes now use {{harvnb}} Harvard-style references. --Ancheta Wis (talk) 06:46, 19 June 2008 (UTC)
- Looks good. Are you going to be adding page numbers to the books which are missing them? — Wackymacs (talk ~ edits) 07:37, 19 June 2008 (UTC)
- Thank you. No book which is in the Notes is missing page numbers, as far as I know. But when I unearth such information I will augment the article. Some books in the References section are there for cultural reasons, such as Bell and Newell, which is the single most important source, in my opinion. --Ancheta Wis (talk) 02:11, 20 June 2008 (UTC)
For the record I am aware that Lord Bowden's first name is not Lord. But I am forced into this by the strictures of the Citation system while using Harvard references. The Ref patent template also does not appear to play well with the References section. That is the reason that I have the 3 patent citations in a hybrid, one style for the Notes, and the Last, First names preceding the Ref patent template in the References section. --Ancheta Wis (talk) 12:12, 19 June 2008 (UTC)
SandyGeorgia, the harvnb templates still need last|year, but I notice that the 'last=' was missing from the Intel and IEEE. I restored the Citation|last=IEEE and then noticed that the Citation|last=Intel was changed as well. How is the Harvard-style referencing method going to work, in this case? --Ancheta Wis (talk) 01:38, 2 July 2008 (UTC)
First light
We need a name akin to the concept of first light of an observatory telescope; I propose the denotation first good run, and wish to apply it to Baby's first good run, June 21, 1948, 60 years ago. --Ancheta Wis (talk) 23:00, 21 June 2008 (UTC)
- I am wary of defining such "firsts" in computing, bearing in mind the statement in this article that "Defining a single point in the series as the "first computer" misses many subtleties".TedColes (talk) 16:42, 22 June 2008 (UTC)
- Thank you for your considered response. What I refer to is 'the comparison of an expectation to an observation', to use William Shockley's phrase. For example, there were 'screams of joy' when the first p-system for UCSD Pascal compiled itself (the expectation). In my mind, that qualifies as a first good run. Another might be the attainment of 1 peta-flop operation for IBM Roadrunner, just last month. For Baby, the resulting convergence of dots on the Williams tube to the expected location was the first good run. And since the phrase is ostensive, meaning relative to the situation, akin to 'baby's first word', I can see that what the proud parent might view as a triumph might be viewed as something more akin to Michael Faraday's response 'and what is the use of a new-born baby'. Might it be better to use a more prosaic phrase like 'first run'? --Ancheta Wis (talk) 19:20, 22 June 2008 (UTC)
- Herbert Simon once said 'There is no substitute for a working program'. Maybe the phrase might be 'first working program' for Baby. --Ancheta Wis (talk) 19:39, 22 June 2008 (UTC)
Harvard Mark I – IBM ASCC Turing Complete?
It seems like the table titled
"Defining characteristics of some early digital computers of the 1940s (See History of computing hardware)"
has a mistake. In the row about the Harvard Mark I – IBM ASCC in the column "Turing Complete" the link (1998) is clearly copied and pasted from the row about Z3. I don't know if the Harvard Mark I was Turing complete, but the reference is wrong for sure. I am not familiar with the markup that references this table (obviously across multiple pages) and could not remove the information. Can someone else do it?
Stilgar (talk) 07:40, 25 June 2008 (UTC)
- The same reference to Rojas applies to both electromechanical computers, which ran from tape of finite length, and whose programs are of finite length. Rojas shows that it is possible to simulate branching with pre-computed unconditional jumps. That would apply to both Z3 and Mark I. --Ancheta Wis (talk) 08:36, 25 June 2008 (UTC)
I don't agree with extending the Rojas conclusion to another machine. Isn't it more complicated? It sounds like a piece of original research that hasn't been published. Zebbie (talk) 23:30, 22 August 2008 (UTC)
Rojas wrong about Turing Complete?
As a separate issue, I think Rojas' conclusion was wrong. Turing's most important contribution to computer science was to postulate the "halting problem." Simply put, you can't tell how long a program will take to finish. Therefore Turing defined his Turing machine with the conditional branch. Rojas' conclusion, again paraphrased, was: you can write a program without conditionals, but you have to make the tape as long as the program's run time.
1. Rojas is redefining a Turing machine to have no conditionals. I'd argue that is no longer a Turing machine.
2. Rojas' new machine has to know in advance how long the program will run. Turing would argue you cannot know this.
Zebbie (talk) 23:30, 22 August 2008 (UTC)
- The Rojas conclusion applies to jobs which include a while wrapper (code with a loop). The branches were needed to halt the program (the job) in any case. Otherwise the program could only terminate when the program encountered a HALT. A conditional branch to a location containing HALT would do this also. Such a program would stay in the potentially infinite loop until the operator manually terminated the job.
- Jump tables are a technique to accomplish branches.
- The length of time needed to complete a program can be known only to the programmer. I have had associates who had to re-submit jobs because the nervous operator terminated one which ran over 24 hours. But the program was correct and terminated by itself the next time after the operator let it run to completion. --Ancheta Wis (talk) 17:52, 23 August 2008 (UTC)
- On a related note, the 'carry' operation used in the most elementary calculators from centuries ago is a type of 'branch'. I learned this from Hennessy and Patterson's books on Computer Organization. --Ancheta Wis (talk) 13:44, 24 August 2008 (UTC)
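A minimal sketch in C of the jump-table idea mentioned above: the "conditional" branch becomes a single unconditional jump whose target is looked up from a table indexed by the data. This is illustrative only; it is not Rojas's actual construction, and the addresses are invented.

  #include <stdio.h>

  int main(void)
  {
      int x = -3;                      /* the value being tested                       */
      int targets[2] = {100, 200};     /* precomputed jump destinations (hypothetical) */

      /* The comparison only computes an index (0 if x > 0, 1 if x <= 0);
         the jump itself is always the same unconditional jump through the table. */
      int next = targets[x <= 0];

      printf("jump to address %d\n", next);
      return 0;
  }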
Broken links
As it stands, this still doesn't meet the 2008 FA criteria standards. I just ran the link checker tool on this article, and found some broken links (many are used as references):
http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=History_of_computing_hardware
The broken links will need to be replaced with other reliable sources, preferably books. — Wackymacs (talk ~ edits) 07:53, 6 July 2008 (UTC)
Problems with References
At the moment, it seems page numbers are being given in the 'References' section instead of in the footnotes where they should be. — Wackymacs (talk ~ edits) 08:18, 6 July 2008 (UTC)
Why the special section?
Why is there a special section for 'American developments' and not one for 'British developments', or any other country? Are Americans special?
--Bias Detector-- 21st July 2008 —Preceding unsigned comment added by 86.9.138.200 (talk) 16:45, 21 July 2008 (UTC)
- See the article: "There were three parallel streams of computer development in the World War II era; the first stream largely ignored, and the second stream deliberately kept secret." 1)=Zuse 2)=secret UK 3)=ENIAC etc. --Ancheta Wis (talk) 18:14, 21 July 2008 (UTC)
Shannon's thesis
Claude Shannon founded digital design. Open any electrical engineering book and you will see what Shannon did. This is a link to his thesis. --Ancheta Wis (talk) 10:07, 27 January 2009 (UTC)
This isn't the same as "implementing" a circuit. However ground-breaking his thesis, it provided a proof, not an implementation. Follow the wikilinks. All we have is words to communicate here; we do need to be able to understand what they mean to make progress on this issue. --TraceyR (talk) 10:42, 27 January 2009 (UTC)
"In his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, Claude Elwood Shannon 'proved' that Boolean algebra and binary arithmetic could be used to simplify the arrangement of the electromechanical relays then used in telephone routing switches, then turned the concept upside down and also proved that it should be possible to use arrangements of relays to solve Boolean algebra problems."
Thank you for taking this to the talk page, which I propose be the venue for improving the article: "In 1937, Shannon produced his master's thesis[61] at MIT that implemented Boolean algebra using electronic relays and switches for the first time in history." In this sentence, implemented refers to George Boole's work, which Shannon reduced to practice. Proof was established in the nineteenth century, before Shannon, by Boole. In other words, Shannon implemented Boole, with Boolean logic gates. In turn, successive generations of engineers re-implemented these logic gates in successive, improved technologies, which computing hardware has taken to successively higher levels of abstraction.
As a metaphor, take Jimbo Wales' statement of principle for Wikipedia. All successive editors implement Wales' vision. In the same way, Shannon implemented Boole.
If you have improvements to the article, I propose we work through them on the talk page. --Ancheta Wis (talk) 11:18, 27 January 2009 (UTC)
I think I see the disconnect: some things can be viewed as purely academic and theoretical; Boole's system of logic might be viewed in this light. But when Shannon expressed Boole's concepts in hardware (which had been done in an ad-hoc way earlier) he showed AT&T that there was another way to build the PSTN, which at one time was completely composed of humans doing the switching of telephone conversations. Today of course, this is all automated. So Shannon's accomplishment was essentially to provide an alternative vocabulary for the existing practice and mindset of the telephone company which in 1937 was analog circuitry. --Ancheta Wis (talk) 11:34, 27 January 2009 (UTC)
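As a small illustration of the point (a hedged sketch in C, not taken from Shannon's paper or from the article; the function names are invented): each relay contact can be modeled as a bit, series wiring as AND, parallel wiring as OR, and a normally-closed contact as NOT. Checking a Boolean identity by enumeration then corresponds to exhaustively testing the equivalent switching circuit.

  #include <stdio.h>

  static int AND(int a, int b) { return a & b; }   /* contacts in series      */
  static int OR (int a, int b) { return a | b; }   /* contacts in parallel    */
  static int NOT(int a)        { return !a;     }  /* normally-closed contact */

  int main(void)
  {
      /* Check the identity a XOR b = (a OR b) AND NOT(a AND b) over all inputs,
         the way the corresponding relay circuit could be verified case by case. */
      for (int a = 0; a <= 1; a++)
          for (int b = 0; b <= 1; b++)
              printf("a=%d b=%d -> %d\n", a, b, AND(OR(a, b), NOT(AND(a, b))));
      return 0;
  }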
Here is a proposed sentence and reference:
- Claude Shannon showed there is a one-to-one correspondence between the concepts of Boolean logic and certain electrical circuits, now called logic gates, ubiquitous in digital computers. -- Claude Shannon, "A Symbolic Analysis of Relay and Switching Circuits", Transactions of the American Institute of Electrical Engineers, Vol. 57 (1938), pp. 713-723.
--Ancheta Wis (talk) 12:54, 27 January 2009 (UTC)
- That looks fine. Go with it. --TraceyR (talk) 10:26, 28 January 2009 (UTC)
I need to put in a plug for Emil Post's work. His formulation of the Turing machine is simpler and Post was actually earlier than Turing, but he failed to publish early enough. That is actually the reason I left in the 'and others'. But, c'est la vie. Maybe the Post-Turing machine will gain currency in future Category:Computational models. --Ancheta Wis (talk) 18:36, 28 January 2009 (UTC)
- Perhaps Post's "worker" can be regarded as a "machine" or perhaps not. Either way, the evidence seems to point to Turing's 'On Computable Numbers' paper as having had considerable influence on subsequent developments. If you think it only fair to revise my edit, is 'others' (plural) the right word—maybe just refer to Post. I think there is a serious omission from the article in that it does not make any reference to Turing's Automatic Computing Engine design, which had important differences from von Neumann's 'First Draft' design. Incidentally, it is easy to underestimate the very close transatlantic co-operation during the Second World War—Hodges says that Turing cited von Neumann's paper in his own 1945/46 ACE paper. TedColes (talk) 23:06, 28 January 2009 (UTC)
- I have No Problem with your edits as I respect your work. Perhaps we can also use the first-hand memoirs from First-Hand:History of the IEEE Global History Network to entice more editors to contribute here. --Ancheta Wis (talk) 01:47, 29 January 2009 (UTC)
- I was not aware of Networking the History of Technology—I am not a member. But it looks like a potentially excellent and authoritative source. TedColes (talk) 06:56, 29 January 2009 (UTC)
Shannon and Stibitz
Since Stibitz is mentioned in the same paragraph as Shannon, there is a suggestion that Stibitz's work was based on Shannon's thesis. If this is the case, perhaps this should be stated explicitly (and mentioned in the Stibitz article too). If not, maybe a new paragraph is needed. --TraceyR (talk) 14:02, 29 January 2009 (UTC)
- Or, since they both worked for Bell Labs, connect with more text.
- A new paragraph would be less work. --Ancheta Wis (talk) 15:03, 29 January 2009 (UTC)
- If Stibitz knew of Shannon's thesis and used it in his work, the article ought to reflect this. Is there citable evidence to enable this link to be made? That both worked for Bell is certainly circumstantial evidence, but is it enough to make the link? --TraceyR (talk) 15:21, 29 January 2009 (UTC)
Hatnote mess
Many of the section hatnotes are a little non-sequitorous. Others "belong" in other sections. I don't have the time to sift through them all myself though. –OrangeDog (talk • edits) 18:37, 29 January 2009 (UTC)
Voltages ... were ... digitized
The lead summary states: "Eventually the voltages or currents were standardized, and then digitized". Could someone explain how voltages or currents were digitized? In what way(s) was this breakthrough made? I thought that my PC used 'analogue' power. Many thanks. --TraceyR (talk) 07:42, 30 April 2009 (UTC)
- You can look up the voltages in the successor to the TTL databook. The logic series was 5400, then 7400, then 4000, etc. The 1970s 7400 Low power: "1.65 to 3.3V". We need an article about this, from 28V for relays, successively lower as power consumption became greener. Maybe WtShymanski can step in? --Ancheta Wis (talk) 21:25, 30 April 2009 (UTC)
- When looking up DTL (1961) I see the levels were -3V and ground. So you can see the voltages were digitized from the beginning. --Ancheta Wis (talk) 00:41, 1 May 2009 (UTC)
- Here is a handy table for the different logic families. --Ancheta Wis (talk) 00:50, 1 May 2009 (UTC)
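For readers wondering what "digitized" means here, a small C sketch may help: a logic input treats a continuous voltage as one of a few discrete levels by comparing it against thresholds. The threshold values below are loosely based on 5 V TTL input levels (V_IL around 0.8 V, V_IH around 2.0 V); they are illustrative assumptions, so consult a datasheet for any particular logic family.

  #include <stdio.h>

  typedef enum { LOGIC_LOW, LOGIC_HIGH, LOGIC_UNDEFINED } logic_level;

  /* Classify a continuous input voltage as a discrete logic level. */
  static logic_level read_level(double volts)
  {
      if (volts <= 0.8) return LOGIC_LOW;    /* at or below V_IL: guaranteed 0  */
      if (volts >= 2.0) return LOGIC_HIGH;   /* at or above V_IH: guaranteed 1  */
      return LOGIC_UNDEFINED;                /* the forbidden region in between */
  }

  int main(void)
  {
      const double samples[] = { 0.2, 1.4, 3.3, 4.9 };
      for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
          printf("%.1f V -> level %d\n", samples[i], (int)read_level(samples[i]));
      return 0;
  }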
- If anyone can tell me what that paragraph is supposed to be saying, I'll buy him/her a donut. The whole lead is garbage and must be rewritten. For every Von Neumann chip out there there's probably a half-dozen Harvard-style chips - let's not lie excessively in the first paragraph. --Wtshymanski (talk) 18:49, 24 September 2009 (UTC)
- Ever notice how a perfectly clear Wikipedia article, by gentle stages, eventually becomes something that looks like the transcript of the speech of a cat lady having a bad day? One's confidence in the ever-upward spiral of Wikiprogress is shaken. List all the synonyms, show how it's spelled in different varieties of English, and, perhaps, include a diatribe on how it was *really* invented by Tesla/a Hungarian/a Canadian/an ET - put all that in the first sentence with enough links and footnotes, and you're well on the way to mania. --Wtshymanski (talk) 20:17, 24 September 2009 (UTC)
I appreciate ArnoldReinhold's edits; they show that the flat memory model is a definite advance on the delay line memory model that early programmers had to deal with; however the current style of programming did not arise from nothing. If the deleted edits were unclear, then we might have to give an example of the contortions that programmers had to go through when solving a problem in the early days. Hardware-independent programming did not exist in the early days. Even today, operating system-independent programming is not a given: the API is typically OS dependent. In the absence of contributions to the article in this vein, consider how one would have to program if the items in memory were to decay before they were reused -- one would be forced to refresh critical data before the delay time had elapsed. --Ancheta Wis (talk) 19:01, 24 May 2009 (UTC)
- You seem to be implying that refreshing memory was the programmer's responsibility, which it wasn't. A better example might be the programming contortions required to access the early magnetic drums. --Malleus Fatuorum 19:07, 24 May 2009 (UTC)
- That was the point of the deleted text (on accessing the magnetic drums). --Ancheta Wis (talk) 21:37, 24 May 2009 (UTC) rvv --Ancheta Wis (talk) 05:25, 1 September 2009 (UTC)
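To make the "contortions" concrete, here is a minimal C sketch of the kind of bookkeeping that drum-based machines pushed onto the programmer (so-called optimum coding): place the next instruction at the drum position that will be passing under the read head when the current instruction finishes, or pay a full revolution of latency. The drum geometry and timings below are hypothetical, not taken from any particular machine or from the deleted text.

  #include <stdio.h>

  #define WORDS_PER_TRACK 50   /* hypothetical drum geometry */

  /* Address at which to place the next instruction so that it arrives under the
     read head just as the current instruction (which takes exec_word_times
     word-times to execute) completes. */
  static int best_next_address(int current, int exec_word_times)
  {
      return (current + 1 + exec_word_times) % WORDS_PER_TRACK;
  }

  int main(void)
  {
      int addr = 7;        /* hypothetical address of the current instruction */
      int exec_time = 4;   /* hypothetical execution time in word-times       */
      printf("place the next instruction at word %d\n",
             best_next_address(addr, exec_time));
      return 0;
  }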
Introduction
I reached this article looking for a reference to the MOSAIC computer (Ministry of Supply Automatic Integrator and Calculator) and wondered if the following Introduction might be short enough and apposite:
Computing hardware subsumes (1) machines that needed separate manual action to perform each arithmetic operation, (2) punched card machines, and (3) stored program computers. The history of (3) relates largely to (a) the organization of the units to perform input and output, to store data and to combine it, into a complete machine (computer architecture), (b) the electronic components and mechanical devices that comprise these units, and (c) higher levels of organization into 21st century supercomputers. Increases in speed and memory capacity, and decreases in cost and size in relation to compute power, are major features of the history.
Five lines instead of 36. The present Introduction could become the first section, headed say Overview, and the pre-stored program coverage extended to mention the abacus, the National Accounting Machines that "cross footed" under control of a "form bar" that facilitated table construction using difference methods, and machines of the mid 20th century typified by the Brunsviga and Marchant. The overlap of punched card and stored program computers, by dint of control panels and then card programmed computers, could be mentioned. Michael P. Barnett (talk) 01:47, 24 December 2010 (UTC)
- Used your 5-line suggestion for the Introduction. Please feel free to incorporate the remainder of your contribution into the article. Thank you for your suggestions. --Ancheta Wis (talk) 11:54, 26 December 2010 (UTC)
Argument at IEEE 754-1985
There is currently a slow edit war at IEEE 754-1985. I put down the Z3 as the first working computer as is in this article and it was reverted. I pointed out this article as a better venue to argue matters about history but they can't be bothered to do that so I'm doing it instead. Discussion at Talk:IEEE 754-1985#Z3 first working computer. 17:50, 8 February 2011 (UTC)
Punched cards derived from Jean-Baptiste Falcon (1728)
Please put in a note that the idea of punched-card-driven looms originated with the French mechanic Jean-Baptiste Falcon in 1728, although Falcon never succeeded in building one himself. —Preceding unsigned comment added by 91.97.182.235 (talk) 15:12, 13 February 2011 (UTC)
- I can't see why; they were a development of the perforated paper rolls being used for the purpose, and he didn't make it work. Who used the perforated paper rolls first and when would be more relevant. Also relevant at this level of detail possibly would be the barrels with pins which were used before that for controlling automatons, and as far as I know Hero of Alexandria used them first. Dmcq (talk) 19:07, 13 February 2011 (UTC)
Transition from analog to digital
I propose to rename the analog section in order to preserve the content that was removed.
Alternatively, perhaps a new section with this name might be inserted to contain that content. --Ancheta Wis (talk) 11:15, 5 May 2011 (UTC)
- The business about accuracy is practically irrelevant. Digital computers are more convenient. It is like the difference between solving geometry problems the Greek way and solving them using Cartesian coordinates. The Cartesian coordinates may be more long winded in some cases but they just work. Dmcq (talk) 11:09, 9 May 2011 (UTC)
- Noise is relevant. A usable signal to noise ratio is the fundamental reason that digital circuits are more accurate than analog circuits. --Ancheta Wis (talk) 11:22, 9 May 2011 (UTC)
- You mean precision, not accuracy. Doesn't matter how many bits you have in the number if it's the wrong number. --Wtshymanski (talk) 13:21, 9 May 2011 (UTC)
- Yes, you are quite right about the distinction. Thank you. --Ancheta Wis (talk) 13:58, 9 May 2011 (UTC)
- I believe the original idea behind ENIAC was that it should emulate a differential analyser but that idea was abandoned early on as lacking in vision. Even Ada Lovelace and Babbage knew better. Dmcq (talk) 17:12, 9 May 2011 (UTC)
Was the Harvard Mark I "Turing Complete"? -- Revisited
We currently label the Mk I as NOT Turing complete - presumably because of a lack of jump instructions. There was some discussion of this on this talk page back in 2008.
It must be noted that:
- Turing completeness says that a machine is considered to be Turing complete if it can emulate a Turing complete machine.
- One instruction set computer points out that a machine that implements nothing more than the 'SUBLEQ' instruction is Turing complete.
- Harvard Mark I says that the Mk I could run from a looped paper tape - so even without a formal jump instruction, it could run the same set of instructions over and over indefinitely.
- The following program demonstrates that it is possible to emulate a SUBLEQ machine with code inside a single 'while(1)' loop - which the Harvard Mark I could have implemented via paper tape looping:
// Initialization:
typedef unsigned char byte;
int lut[256] =
{
  1, 1, 1, 1, 1, 1, 1, ....   // 128 ones.
  0, 0, 0, 0, 0, 0, 0, ....   // 128 zeroes.
};
byte mem[...whatever...] = { ...whatever... };  // The initial state of memory in the SUBLEQ machine
int PC = 0;                                     // The SUBLEQ machine's program counter.

// Runtime:
while (1)   // (Implemented via a paper tape loop)
{
  // Read instruction operands from the program counter location.
  int a = mem[PC++];
  int b = mem[PC++];
  int c = mem[PC++];
  // Perform subtraction:
  mem[b] -= mem[a];
  // Use lookup table to extract sign of mem[b] so that:
  //   c is multiplied by 1 and added to the program counter if mem[b]<=0
  //   c is multiplied by 0 and added to the program counter if mem[b]>0.
  PC += lut[mem[b]+128] * c;
}
Ergo, the Harvard Mark I was indeed Turing Complete. This is rather important IMHO. SteveBaker (talk) 15:26, 3 May 2012 (UTC)
- Why is 'Turing completeness' important? It is not synonymous with 'general purpose' - and that certainly would not be claimed for the Harvard Mark I. --TedColes (talk) 15:43, 3 May 2012 (UTC)
- It's important because the Church-Turing thesis says that all computers that are Turing complete are equivalent (given enough time and memory). If the Mk I is Turing complete - then (with enough time and memory) it could emulate any modern computer - so we'd have to say that it should be considered to be "general purpose". Turing completeness is what truly separates the modern concept of the word "computer" from some mere calculator or sequencer. SteveBaker (talk) 16:52, 3 May 2012 (UTC)
- It seems that you are correct. However, we have a problem. Here in Wikipedia, we can't publish original research. What we need is a reliable source that claims this (or the contrary). ---- CharlesGillingham (talk) 20:02, 3 May 2012 (UTC)
- I'm painfully aware that this discovery is my own WP:OR - and therefore problematic without reliable sources. However, the entire section History_of_computing_hardware#Early_computer_characteristics has not one single reference - so why should this article have an unreferenced falsehood rather than an unreferenced truth? We do state as a fact that the Mk I is not Turing complete - and that is stated without references. Per WP:V we can only do that if this statement is uncontroversial. Well, following my reasoning, it most certainly is controversial because both you and I agree that it's untrue. Hence until/unless we can find a WP:RS we have three alternatives:
- Leave the article as it is - with an unreferenced, controversial (and seemingly false) statement.
- Change the article to say that the Mk I is indeed Turing complete - leaving an unreferenced (but evidently true and hopefully uncontroversial) statement.
- Remove that table (or at least the "Turing complete" column or the "Mk I" row) on the grounds that it is "is likely to be challenged" and has no WP:RS to back it up (per WP:V).
- I don't think (1) is acceptable - so we either need to change the (unreferenced) "No" to an equally unreferenced "Yes" - or nuke the table (per WP:V) on the grounds that it's both un-sourced and controversial. Ideally of course we should find a reliable source - but until we do, the article shouldn't contain an unreferenced statement that we now know to be false.
- SteveBaker (talk) 12:46, 7 May 2012 (UTC)
- Turing completeness is clearly controversial, so I would favour removing that column from the table. The nuclear option of deleting the whole table seems extreme, particularly as the transcluded form has been removed from a whole host of articles. As regards the lack of references, readers can look to the articles about the individual machines. --TedColes (talk) 17:04, 7 May 2012 (UTC)
I think the Turing completeness column is useful to our readers as a rough guide to how the technology evolved. The controversial entries should have a footnote that says later researchers have attempted to show the machines in question were Turing complete but those capabilities were not envisioned when the machines were developed and used. --agr (talk) 10:39, 9 May 2012 (UTC)
- Only if all the entries can be verified from independent sources, and not original research, should this column be retained. --TedColes (talk) 11:38, 9 May 2012 (UTC)
- This table is only useful as a "rough guide" if it actually contains true facts. Before I edited it, the article said that the Mk I is definitely not Turing complete - which was clearly false. That's not a "useful rough guide" - it's a misleading untruth!
- The historical matter of whether the machine's developers were trying to make the machine Turing complete is moot because the Church-Turing thesis wasn't widely accepted or its implications understood until Kleene's paper was published in the early 1950's...six or more years after the Harvard Mk I entered service. Before Church-Turing, it really didn't seem to matter a damn whether a machine was Turing complete or not because nobody knew that Turing-completeness was the key to making a fully general-purpose computer. They couldn't have known how important that is - and therefore were unlikely to build specific features into their machines to ensure that it crossed that threshold. It's not like researchers were pushing steadily towards Turing-completeness - so the column of Yes's and No's doesn't really exhibit a trend in the design of early computers.
- Neither I, nor WP:V have any problem with putting unsourced material into the encyclopedia provided that it's not controversial. You don't need to find sources for "The sky is blue", "2+2=4" or "My laptop is Turing complete". But as soon as a statement becomes controversial, you either have to find references for it or remove it. Personally, I'm 100% convinced that the Harvard Mk I was Turing complete - and IMHO our article wasn't just controversial, it was downright wrong. But my argument alone should suffice to convince everyone that the statement that the Mk I is not Turing complete is at the very least controversial. So no matter what, the article can't say that.
- The decision then comes down to either:
- If everyone accepts my argument (above) - then a "Yes" next to the Harvard Mk I isn't controversial - and we can change the article to say that without a reference (although that would still be nice to have)
- ...OR...
- One or more people here disagree with (or don't understand) my argument - so the table is controversial whether it says "Yes" or "No". Since it's unreferenced material - it must be deleted in order to resolve the controversy.
- SteveBaker (talk) 13:21, 9 May 2012 (UTC)
- If we don't know we should just put in a dash, we don't have to say yes or no. I know some people just can't stand uncertainty so will argue forever about grey things like that and personally I'm no fan of the Turing column so I wouldn't miss it. The real point is that people couldn't be bothered with anything like that, Zuse for instance wanted to produce programmable scientific calculators that individual engineers or small groups could use, for that price was a main constraint. Colossus was built to crack codes. Universality just wasn't one of the things the early pioneers were interested in. You compare them against the Manchester Baby which was easily universal but totally impractical and built just to test out some ideas especially the Williams tube memory. Universality doesn't require much as can be seen from the game of Life, I think the Baby can be celebrated as the first computer with a modern structure having a stored program rather than all the configuring of ENIAC which was an automated bunch of electronic tabulators in effect. If anything I'd put down the main innovation in them or what they were for rather than the Turing completeness column. Perhaps change the 'Programming' column to 'Description' and add under the Baby for instance "Testbed for Williams tube memory. First stored program computer." Dmcq (talk) 16:53, 9 May 2012 (UTC)
Flamm citations
To anon 86.177.118.203: I patched in a phrase in the new footnote 1 which I hope matches your intent. Please feel free to alter my patch to your contribution. --Ancheta Wis (talk | contribs) 03:25, 25 January 2013 (UTC)
In the same light, I propose to use 'accelerated' rather than 'underpinned' in your contribution because the article makes it clear that there were funding sources other than military contracts, in both the US and Germany. I do not deny that IC-based computers in military systems (1958-1960s) were materially funded by US (& likely USSR) contracts. --Ancheta Wis (talk | contribs) 04:06, 25 January 2013 (UTC)
- Sorry, going to be a pain! With regard to the USSR, I feel the word underpinned to describe government involvement is already an understatement; I feel underpinned is also the appropriate term to use for development elsewhere. Also the sources say that the investment from the private sector pales into insignificance when compared with the resources ploughed in from government. Just so it's not my word (all quotes below are from reviews of Flamm's studies): "As Flamm points out, one of the crucial features of the computer was the role played by government and universities in the early stages of research and development when a great deal of 'Knightian' uncertainty (as opposed to risk) deterred private companies from significant commitment. ... [In Japan,] the Ministry of International Trade and Industry was crucial". An "insignificant" commitment from the private sector, according to the sources cited, for early stages of computer development and the computer market. According to Flamm, whose account is, at least in my understanding, what we must accurately represent, governments more than "accelerated" the development and commercial viability—it wouldn't have happened without them. "the U.S. government, especially its military arm, has played a crucial role at critical times in bringing the computer out of the experimental stage to its present strong position in the marketplace". Again: "the government's multifaceted involvement ... [included that of] research sponsor, principal customer, and regulator of computing technology". And again: "government support is crucial because of the financial disincentives for private investors to be involved in long-term Research and Development". So I'm cheering for a slightly more emphatic term than accelerate, at least for early development and the creation of a viable market!
86.177.118.203 (talk) 00:06, 26 January 2013 (UTC)
- I appreciate your response, and have reverted my wording. Thank you for your precis of the Flamm works.
- Computing, IC engineering, Arpanet, Quantum cryptography, and so forth, would look very different with different/ alternative funding histories. And these topics are germane to the article. -Ancheta Wis (talk | contribs) 00:47, 26 January 2013 (UTC)
categories
What category (or categories) is appropriate for machines that use integrated circuits, but don't put the entire processor on a single chip? In other words, what category covers what History of computing hardware (1960s–present) calls "Third generation" computers?
In other words, what category goes in the blank of the following?:
- The category: electro-mechanical computers or category: vacuum tube computers include articles about "First-generation machines"
- The category: transistorized computers includes articles about "Second generation" computers.
- The category ----?---- includes articles about "Third generation" computers, such as all the computers mentioned in the book The Soul of a New Machine.
- The category: microcontrollers and category: microprocessors (and its many sub-categories) cover single-chip processors -- the processors used in "Fourth generation" computers.
--DavidCary (talk) 14:55, 23 August 2013 (UTC)
- Perhaps category: minicomputers ? --DavidCary (talk) 14:55, 23 August 2013 (UTC)
The category: minicomputers covers many of them, but it doesn't cover other multi-chip processors such as the Apollo Guidance Computer, the Cray-1, the first hardware prototype of the Motorola 6800, etc.
Should we start a new category, perhaps category: integrated circuit processors? --DavidCary (talk) 14:55, 23 August 2013 (UTC)
Archimedes' method
Archimedes' method of performing calculations was the use of a mechanical balance (think see-saw) of countable items versus the object being measured. This method was used for estimating the number of grains of sand in the universe, etc. (see The Sand Reckoner).
Thus Archimedes' method of calculation was very concrete, as befits his status as engineer, inventor, and physicist. For this reason I propose to add his method to history of computing rather than to this article. I am pretty sure there is already a main article about this. --Ancheta Wis (talk | contribs) 02:19, 30 September 2013 (UTC)
- That sounds reasonable to me. Bubba73 You talkin' to me? 02:28, 30 September 2013 (UTC)
Claim that Zuse is commonly known as the "inventor of computer" is wrong.
[edit]The lede previously claimed that Zuse was commonly known as *the* "inventor of the computer" and the only citations given are to discussions in blogs. Published histories of computing have variously proposed that the "inventor of the computer" is Babbage (who designed the first programmable computer), Aiken (for the Harvard Mark I, which was a highly influential electromechanical computer designed and built around the time of Zuse's Z3), Atanasoff (for the first electronic digital computer), Eckert and von Neumann (for the stored program concept), and several other milestones. Zuse's Z3 could certainly support the claim of his being the creator of the first working electromechanical programmable computer, but this does not imply that he is commonly known as the inventor of the computer. Wikipedia articles should not be used to push non-mainstream views.
For now I have moved this claim down to the section on Zuse's computer, but I think that either a separate section discussing the complex issue of who was *the* inventor of the computer should be added, or this claim should be removed (in any case, the claim needs reputable citations, not just blogs). 198.255.141.250 (talk) 16:33, 22 December 2013 (UTC)
changes
[edit]Hi, the article is rather chaotic and unorganized. It's very difficult for a casual reader to make sense of the important developments and stages. There is also a lot of important information that is missing.Noodleki (talk) 19:19, 7 January 2014 (UTC)
- Noodleki, Thank you for responding! Now, using the WP:BRD protocol, I propose reverting myself, and adding inline tags to indicate what ought to be worked out?
- To all editors, comments on my proposal? In other words, start with Noodleki's changes, and tag Noodleki's edits with concerns.
- For example, I think it is POV to call the earliest known computing devices primitive.
- The invention of zero isn't even marked in the article, and zero was momentous, in my opinion.
- The recognition that the carry operation was a form of branching...
- The upcoming quantum computers are only briefly mentioned, etc., etc.
- Software is only tangentially mentioned. ...
- Or ... some other proposal ...
- Such as agreeing on an outline of the changes? --Ancheta Wis (talk | contribs) 20:35, 7 January 2014 (UTC)
Hi, I understood from the above that you would revert. I think your suggestions equally apply to the version as it stands, although I think software wouldn't necessarily come under this article's purview. Thanks.Noodleki (talk) 21:20, 8 January 2014 (UTC)
- Noodleki, you are welcome to interpolate inline tags, or other comment on the talk page. Regarding your vision of the article, I would be interested in exactly what missing items you are noting. The development of the hardware follows the published history, for example. A retrospective view necessarily distorts what actually happened. If we were to follow Babbage's dream, for example, we would have seen steam-powered computation. But that is not the way computation actually developed. --Ancheta Wis (talk | contribs) 01:02, 9 January 2014 (UTC)
- I'm afraid I don't understand what you mean about inline tags. You said above that you propose to revert yourself, but you don't seem to be doing this. The changes in the article are layout improvement and better organization of material, and more information on key developments such as Babbage and Colossus.Noodleki (talk) 11:31, 9 January 2014 (UTC)
- The WP:BRD protocol requires a discussion - the reverter should explain the reasons behind his/her revert, which is something you aren't doing. Your suggestions apply equally to the article as it stands, and I've already explained the basis for my changes. You also agreed earlier to revert it yourself, and I don't understand why you are not doing this.Noodleki (talk) 11:33, 12 January 2014 (UTC)
Noodleki, I am waiting for the other editors to respond. Your changes for Babbage fit nicely in the nineteenth c. and I suggest that you add them to that section. However I do not agree with your characterization of 'chaotic' and suggest to you that there is a logical flow in the article already. It goes a bit far to place as much emphasis on Babbage as your version, as his design required repeatable manufacturing tolerances beyond the capacities of the nineteenth c. It took another century. __Ancheta Wis (talk | contribs) 12:00, 12 January 2014 (UTC)
- I think Babbage is underemphasized. After all, he was the first to describe a proper computer. '1801: punched card technology 1880s: punched card data storage' is a very strange set of sections and there is far too much emphasis on Hollerith, whose invention was a simple calculating device, similar to Pascal's invention. The article also lacks a 'flow' - it's very disjointed and doesn't explain clearly the important stages. The layout could be greatly improved, the intro shortened, the last section removed as there is a dedicated article for it already. There is also little information on analog computers. All these deficiencies were removed with my edit.Noodleki (talk) 15:06, 12 January 2014 (UTC)
- Babbage's work is one stage in the history of computing hardware. There is more to computing than Babbage. You are welcome to flesh out Babbage's role, but he is not center stage today. The current article states clearly that the pace of development is unabated. --Ancheta Wis (talk | contribs) 04:42, 14 January 2014 (UTC)
- Here is an example of an inline tag.[discuss] --Ancheta Wis (talk | contribs) 04:55, 14 January 2014 (UTC)
- As an example of the pace of computing hardware development, there are multiple streams of development for qubits which are in progress right now. There is no clear winner for implementation, philosophical explanation, or technological exploitation yet. But large amounts of money are being risked right now, as in the Babbage case. IBM is taking yet another path, to make things even more interesting. __Ancheta Wis (talk | contribs) 12:57, 14 January 2014 (UTC)
- I'm not suggesting Babbage is 'center-stage'. I don't know why you bring up qubits - that could go in the Post-1960 article. Anyway, you still haven't provided an explanation for your revert, and you haven't reversed it, despite saying you would. So, I will provisionally put those changes back in, and you can point out problems that you might have with inline citations. Noodleki (talk) 12:04, 16 January 2014 (UTC)
vacuum tube computers
[edit]Is there a Wikipedia article dedicated to vacuum tube computers?
I think there's enough material in this article about vacuum tube computers to create an article (WP:SPINOUT) focused on that category of computers.
Usually when there exists both a Wikipedia category about some topic, and also a Wikipedia "List of" article about that same topic, there is usually an WP:EPONYMOUS article dedicated to exactly that topic.
For example, there is both a list of transistorized computers article and a category: transistorized computers, so I am glad to see there is also a transistor computer article.
I see there is both a list of vacuum tube computers article and a category: vacuum tube computers, so I am surprised that there is apparently no article dedicated to vacuum tube computers.
When I click on vacuum tube computers, hoping to find an article dedicated to them, today I find it is a redirect to vacuum tube, which has much less information (mostly in vacuum tube#Use in electronic computers) about such machines than this "History of computing hardware" article.
Is there an article that more specifically discusses vacuum tube computers that vacuum tube computer and vacuum tube computers should redirect to? --DavidCary (talk) 18:37, 28 May 2015 (UTC)
- I think that there definitely should be such an article. The only article I know of (and can find) is List of vacuum tube computers, which you already know about. Stored-program computer is also relevant, but doesn't have enough information. Bubba73 You talkin' to me? 02:34, 29 May 2015 (UTC)
Wilbur machine
[edit]Hi,
Does the Wilbur machine (analog computer, on display in the science museum in Tokyo) fit in the history of (analog) computers, or did it have any significance? --Butch (talk) 13:40, 22 November 2015 (UTC)
- @Butch: Wilbur could solve up to 9 simultaneous linear equations, but was not programmable for other applications,[1] such as the orbit of Mars, or even the law of falling bodies. It was 'hard-coded', so to speak, and thus inflexible, compared to software-programmable devices. --Ancheta Wis (talk | contribs) 14:38, 22 November 2015 (UTC)
- Thanks for the responses. The Sir William Thomson device mentioned in the article was, as far as I can see, also not programmable (only 'settable'). So my question remains, should the Wilbur machine be mentioned in the article? (Notes: 1) Maybe not under 'analog computers' but under 'early devices'? 2) In the added reference to MIT the keyword 'analog computer' is used! 3) Maybe the Wilbur Machine should have its own lemma. Anybody from MIT?) BTW Ancheta Wis, have you ever seen a software-programmable analog computer?--Butch (talk) 08:18, 23 November 2015 (UTC)
- Analog machines are physical configurations. As arrangements of physical objects, they obey physical laws, such as the mass flow of water, or metal balls, for example. The mathematical solutions in some nomogram, say a Smith chart, are typically mathematical transformations which are geometrical shapes, not mathematical equations. So no, the analog machines are not software, they are typically hardware, used to embody a specific mathematical operation. (Think slide rule or planimeter.) By the way, a quantum computer, using some configuration of qubits would also embody some quantum mechanical experiment, as the analog for something else. --Ancheta Wis (talk | contribs) 09:25, 23 November 2015 (UTC)
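To make the point above concrete, here is a minimal illustrative sketch (Python; the numbers and the function name are illustrative assumptions, not drawn from the article or its sources) of what a slide rule does: it adds physical lengths proportional to logarithms, so that the combined length reads out as a product.

    import math

    def slide_rule_multiply(a, b):
        # A slide rule adds two lengths proportional to log(a) and log(b);
        # reading the summed length back through the log scale yields a * b.
        length_a = math.log10(a)          # position of 'a' on one scale
        length_b = math.log10(b)          # offset contributed by the sliding scale
        return 10 ** (length_a + length_b)

    print(slide_rule_multiply(3.0, 7.0))  # ~21.0, limited only by reading precision

The point of the sketch is that the "program" is the physical arrangement itself; the device embodies one operation rather than interpreting stored instructions.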
- Thanks for explaining the analog computer, but I happen to already know what an analog computer is (during my training I wired/set up an analog computer to simulate a moon landing! For some years I was also a maintenance technician for an analog computer of a missile guidance system!) Also, adding quantum mechanics does not answer my question. So my main question remains: should the Wilbur machine be mentioned in this article (or maybe get an article of its own)?--Butch (talk) 09:45, 23 November 2015 (UTC)
- Since this is a wiki, you are free to contribute to the article. --Ancheta Wis (talk | contribs) 12:55, 23 November 2015 (UTC)
- Thanks, of course I know I can add text to the article (done it before ;-) I just ask for other people's opinion on whether the Wilbur machine is worth mentioning in this article. Seems a simple question to me.--Butch (talk) 13:04, 23 November 2015 (UTC)
Probably, a separate article would fare better, along the lines of the Atanasoff–Berry computer, which attacked the same application (systems of linear equations). Or, a contribution to System of linear equations, including both Atanasoff–Berry computer and Wilbur machine would add interest to a math article. --Ancheta Wis (talk | contribs) 13:14, 23 November 2015 (UTC)
@Butch, I see that Google's quantum computer from D-wave is also a hard-coded device. That is, it embodies some quantum-mechanical experiment. In Google's case, it was quantum annealing. So we are back to the limitations of the Wilbur machine; like the Wilbur machine, the current Google machine is not general purpose, even though it ran 10^8 times faster[2] than a conventional computer working on the same problem,[3] simulated annealing. --Ancheta Wis (talk | contribs) 15:53, 9 December 2015 (UTC)
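For readers unfamiliar with the simulated annealing mentioned above, the following is a minimal generic sketch (Python; the toy energy landscape and the geometric cooling schedule are illustrative assumptions, not Google's or D-Wave's actual benchmark setup):

    import math, random

    def simulated_annealing(energy, neighbor, state, t_start=10.0, t_end=1e-3, alpha=0.95):
        # Metropolis-style annealing: always accept downhill moves, accept uphill
        # moves with probability exp(-dE/T), and cool the temperature geometrically.
        t = t_start
        while t > t_end:
            candidate = neighbor(state)
            d_e = energy(candidate) - energy(state)
            if d_e < 0 or random.random() < math.exp(-d_e / t):
                state = candidate
            t *= alpha
        return state

    # Toy one-dimensional energy landscape with several local minima.
    energy = lambda x: (x - 3) ** 2 + math.sin(5 * x)
    neighbor = lambda x: x + random.uniform(-0.5, 0.5)
    print(simulated_annealing(energy, neighbor, state=0.0))

Quantum annealing attacks the same kind of minimization problem, but by physical evolution of qubits rather than by this software loop.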
References
- ^ Wilbur machine, 1930s, MIT. Accessed 2015-11-22.
- ^ However IBM researchers have previously projected no improvement using quantum annealing
- ^ "...Google says [the D-Wave 2X's] quantum annealing can outperform simulated annealing on a single-core classical processor, running calculations about 10^8mtimes faster" Wired, 11 Dec 2015
Ars Magna?
[edit]Hi
any particular reason Lull's Ars Magna is not included (or at least referenced) here?
T88.89.219.147 (talk) 23:50, 17 May 2016 (UTC)
undefined reference "Robinson" in portion related to Colossus.
[edit]There is an undefined reference to "Robinson" in the portion related to Colossus. — Preceding unsigned comment added by 146.18.173.105 (talk) 19:06, 8 June 2016 (UTC)
- I had to fall back to find a version of the text without the fragment "Both models were programmable using switches and plug panels in a way the Robinsons had not been." dated January 2014. Might this be what you refer to? --Ancheta Wis (talk | contribs) 21:09, 8 June 2016 (UTC)
- In fact this sentence shows up in the very next edit. --Ancheta Wis (talk | contribs) 21:29, 8 June 2016 (UTC)
- Here is a citation for the tape drives (the Robinsons). --Ancheta Wis (talk | contribs) 00:55, 9 June 2016 (UTC)
The Ishango bone
[edit]There is a picture of this artifact but no mention of it in the text. Such an arrangement is not helpful. Kdammers (talk) 17:53, 12 September 2016 (UTC)
- I propose the caption and text "The Ishango bone is thought to be an early tally stick.".[1] --Ancheta Wis (talk | contribs) 01:09, 13 September 2016 (UTC)
- ^ The Ishango bone is a bone tool, dated to the Upper Paleolithic era, about 18,000 to 20,000 BC. It is a dark brown length of bone, the fibula of a baboon. It has a series of tally marks carved in three columns running the length of the tool. It was found in 1960 in Belgian Congo. --A very brief history of pure mathematics: The Ishango Bone University of Western Australia School of Mathematics – accessed January 2007.
- How about "prehistoric" or "paleolithic" instead of "early"? Kdammers (talk) 14:15, 13 September 2016 (UTC)
Done --Ancheta Wis (talk | contribs) 21:36, 18 September 2016 (UTC)
Stored-program computer & MESM
[edit]I assume that MESM, "the first universally programmable computer in continental Europe", that is, present-day Ukraine, should be added to History of computing hardware, before EDVAC. That or EDVAC removed from that section, since it's unclear how it contributes anything there. Or maybe both.
Ilyak (talk) 05:24, 13 March 2017 (UTC)
- @Ilyak, Thank you for your note. In the process of tracing the extant entries in the article, I noticed that one of the Xtools entries for the listed computers failed to activate. To me, this indicates that the entry's article is so little visited that Xtools had not yet built up a persistent item in its stores. While revisiting MESM and other Soviet-era entries, the Xtools report for one of them finally popped up for me, but it took several visits to the entry.
- What your Good Faith contribution highlights is the nature of the community of editors. We write what we know about. I personally learned of the Strela computer from a 1970s era IEEE Spectrum entry, while others contributed what they knew, such as MESM. It is my observation that our sources for our articles are gathered in the same way -- organically, step by step. Students from Iowa learn about ABC, students from Ukraine learn about MESM, students from UK learn about Turing, students from Penn learn about von Neumann, and so forth.
- While adding entries, I used Gordon Bell's Computer Structures, but he had never heard of the Bletchley Park machines, so they were not in this article, at first. In the same way, although there is a Soviet-era computing article, it will take work to add these entries. The article strives for completeness, so MESM ought to be on it. I invite your contribution.
- What this article traces is an evolution, from marks on sticks and clay, to dedicated hardware, first powered mechanically, then electro-mechanically, electrically, electronic, and beyond (artificial neural, qubit, etc.). We add what we know. I invite you to do so. --Ancheta Wis (talk | contribs) 09:49, 13 March 2017 (UTC)
Zuse
[edit]I have been curious how IBM had such good knowledge of Zuse. Perhaps a history which details Dehomag can clarify this. --Ancheta Wis (talk | contribs) 18:12, 23 August 2017 (UTC)
- Based on this citation I am interested in the connection between Willy Heidinger and Konrad Zuse. --Ancheta Wis (talk | contribs) 21:18, 23 August 2017 (UTC)
"History is written by the victors" —Anonymous
[edit]This quotation is a paraphrase of Machiavelli, The Prince, ch. XVIII:
—ch. XVIII
Since we are seeing a revert war, might we consider:
- What good is it to rile up the editors of the article? What purpose is served by trolling the article? There are policies against this.
- "Clausewitz had many aphorisms, of which the most famous is 'War is the continuation of politics by other means.' " Might we think the Holocaust was war, but begun 10 years earlier?
- Francis Bacon noted "knowledge is power", and counted the invention of gunpowder as an advance of his civilization. Galileo figured out the equations for a falling body because he was paid to do so but they apply directly to gunnery tables. Think ENIAC.
- One of the inventors of a new mathematical notation which is just now being applied to the newest programming languages starved to death as a direct result of his membership in the Nazi party.
- The footnote #141 Kalman 1960 was applied directly to an aerospace defense application, as implemented in integrated circuits
- The new computer languages of the 1950s forward were applied directly to an aerospace defense application
- Elon Musk warns of the application of AI to a new world order. The internet is destroying our political institutions; must we wait any further before designating this as a theater of war?
I am being vague because these statements could be misused against the existing order. I for one wish to preserve the stability of the existing order. --Ancheta Wis (talk | contribs) 07:58, 8 September 2017 (UTC)
See: The Social Construction of Reality. In other words, as social beings, we belong to social systems which can be at war with each other. Can't we rise above the issues that divide us, and join in building up the social systems that unite us? --Ancheta Wis (talk | contribs) 08:15, 8 September 2017 (UTC)
I paraphrase the preface to The Answers of Ernst von Salomon to the 131 Questions in the Allied Military Government "Fragebogen" (this book has never been out of print in Germany since its first publication). Ernst von Salomon wrote (I paraphrase): "As I wrote my answers, which would determine whether I lived or died, whether I would remain imprisoned or go free, I got the sense of a vast alien intelligence that had not the slightest interest in my own well-being ..." 08:31, 8 September 2017 (UTC)
"magnetic storage" under "stored program"?
[edit]I don't think that "magnetic storage" should be a subsection under "stored program". Magnetic storage isn't necessary for a stored program computer. Bubba73 You talkin' to me? 02:31, 24 September 2017 (UTC)
- There is a qualitative difference, akin to reading from scrolls versus codices versus hypertext. We think and program differently in the respective cases. The techniques scale differently as well. Assembler versus FORTRAN versus the web languages. Maybe this takes more planning for the article. --Ancheta Wis (talk | contribs) 16:02, 24 September 2017 (UTC)
Why no mention of Alonzo Church, who predated Turing?
[edit]Turing is known for articulating the idea of a universal computer, but the first description of a universal computer was the lambda calculus, invented by Alonzo Church (who then became Turing's thesis advisor). Doesn't he belong in the same section with Turing? Briankharvey (talk) 20:50, 16 October 2017 (UTC)
- Cite it and write it! --Wtshymanski (talk) 20:52, 16 October 2017 (UTC)
- This addition is a big step because we would be writing about computer science and abstract machines (such as the lambda calculus) rather than the simple generalization by Turing from paper tape reader and punch. It's a whole new article, history of abstract machines. If the text starts here, it will have to be moved eventually. --Ancheta Wis (talk | contribs) 22:18, 16 October 2017 (UTC)
- @Briankharvey, Other Wikipedians have delved into this history before; see Talk:Post–Turing machine. I'm afraid you are going to need a suitable citation for the claim that Church's work on a universal computer preceded Turing. There were a lot of threads that hit all around the topic. See for example, rough timeline:
- ___________________>Bertrand Russell-->Alonzo Church <-- Turing
- |________________________________>History of computer science, List of pioneers in computer science
- ______________________Max Newman-->Alan Turing
- _______________Emil Post ---------------------------> Martin Davis
- ___________________________________Hao Wang (academic)
- 01:06, 17 October 2017 (UTC)
Amateur computing
[edit]Nothing on amateur computing?
john f 2.26.119.204 (talk) 09:24, 5 December 2017 (UTC)
Importance of NeXT Computer Mention in Article
[edit]A NeXT Computer and its object-oriented development tools and libraries were used by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web server software, CERN httpd, and also used to write the first web browser, WorldWideWeb. These facts, along with the close association with Steve Jobs, secure the 68030 NeXT a place in history as one of the most significant computers of all time.[citation needed]
This strikes me as opinion, and not necessarily fitting for a topic on computing hardware. Internet history, definitely; however, it is still phrased as opinion. I happen to agree that the NeXT Computer (I believe the NeXTcube) that Tim Berners-Lee used to develop the WWW is historical, and I would add John Carmack's development of Doom on a NeXTstation, but I don't feel this paragraph fits in this article.
Perhaps the history of the WWW, an article on the history of Next, video games, etc. but not in this article.
Communibus locis (talk) 21:42, 18 January 2018 (UTC)
reference
[edit]The article has a book citation to Reconstruction of the Atanasoff-Berry Computer by John Gustafson. I can't find such a book but there is this paper. Is that it? Bubba73 You talkin' to me? 04:22, 8 April 2018 (UTC)
Pictorial Reports on the Computer Field (Computers and Automation, 1957+)
[edit]== Computers and Automation Magazine ==
Pictorial Report on the Computer Field:
- A PICTORIAL INTRODUCTION TO COMPUTERS - 06/1957
- A PICTORIAL MANUAL ON COMPUTERS - 12/1957
- A PICTORIAL MANUAL ON COMPUTERS, Part 2 - 01/1958
- 1958-1966 Pictorial Report on the Computer Field - December issues (195812.pdf, ..., 196612.pdf)
A PICTORIAL INTRODUCTION TO COMPUTERS - [1], pp. 49-56
A PICTORIAL MANUAL ON COMPUTERS - [2], pp. 10-13, 15-17, 19-24, 28, 30, 32
A PICTORIAL MANUAL ON COMPUTERS, Part 2 - [3], pp. 12-17, 20-22, 24, 26-27
1958 Pictorial Report on the Computer Field - [4], pp. 6, 8-10, 12-14, 16-18, 20-21
1959 PICTORIAL REPORT ON THE COMPUTER FIELD - [5], pp. 8-19
1960 Pictorial Report on the Computer Field - [6], pp. 13-32
1961 PICTORIAL REPORT ON THE COMPUTER FIELD - [7], pp. digital 24-36, analog 41-45, I/O devices 60-69; 72-78, 83-88 (Bernoulli disk rotating storage device - p. 62, IBM 1301 - 69, Semiconductor Network Computer - 85)
1962 PICTORIAL REPORT ON THE COMPUTER FIELD - [8], pp. 26-42, I/O / components/others: 67-73 / 74-79/80-82
1963 PICTORIAL REPORT ON THE COMPUTER FIELD - [9], pp. 26-44
1964 PICTORIAL REPORT ON THE COMPUTER FIELD - [10], pp. 28-36, 37-51 (UNIVAC FLUID COMPUTER - air-operated, SDS 92 IC, Fairchild Planar II)
1965 Pictorial Report on the Computer Field - [11], pp. 18-30, 31-38; IC memories, Floating Floor :)
1966 Pictorial Report on the Computer Field - [12], pp. 22---89.25.210.104 (talk) 18:21, 19 June 2018 (UTC)
- It's not clear why these are listed here (or why the list is hidden), but I agree they're interesting to read through. Dicklyon (talk) 20:06, 4 November 2018 (UTC)
- I put it here because I didn't know where it could be placed (for public view). Hidden to save vertical space for people not interested (and because I'm using [very] low screen resolution). --MarMi wiki (talk) 20:27, 12 December 2018 (UTC)
- How about as a bulleted list under Computers And Automation Magazine in External Links? Tom94022 (talk) 00:51, 13 December 2018 (UTC)
- I redacted the list as a new section (see above) to be put before External Links, can it be put there? --MarMi wiki (talk) 00:45, 30 December 2018 (UTC)
- IMO I don't see a new section as appropriate, but a bulleted list under External Links or maybe even under Further Reading seems appropriate. Just my 2 cents, u might want to get other opinions. Tom94022 (talk) 01:00, 30 December 2018 (UTC)
- Added to FR. It may require splitting to pre and post-1960. --MarMi wiki (talk) 21:48, 30 December 2018 (UTC)
Too much women bias
[edit]Both men and women contributed, but the work of women has been exaggerated and the sources are inaccurate, based on the words of feminist authors rather than neutral ones. When a job is male-specific we never say "the field was primarily dominated by men", but if women were involved in even the slightest roles we bring up "women were involved", and for jobs primarily assigned to women we say women were more involved. This is an article on computer hardware, not a feminist propaganda article! The source Light, Jennifer S. (July 1999). "When Computers Were Women". Technology and Culture. 40: 455–483. comes from a feminist[citation needed] author rather than neutral research and is unreliable. Respected Person (talk) 10:16, 14 December 2018 (UTC)
- See WP:BRD —The expectation is that we will Discuss changes to the article on this talk page. --Ancheta Wis (talk | contribs) 16:04, 14 December 2018 (UTC)
- Ms. Light is a well-credentialed historian and there is no evidence that she is a "feminist" author. I suggest we have, so to speak, a "woman bites dog" issue here. IMO the dominance of one gender as the early "computers" is worth including, as well as the other objected-to feminist "bias". Tom94022 (talk)
- List of pioneers in computer science had a similar problem. Bubba73 You talkin' to me? 18:04, 14 December 2018 (UTC)
- I'm learning that that list is roughly alphabetical by surname, but it's not rigorously alphabetized. Might you object if we editors added surname-first aliases to the list in that article: for example, 73, Bubba; or Lovelace, Ada; or Post, Emily? I tried the experiment and it doesn't seem to break that list. That way we could improve the alphabetic sort in that list, one pioneer at a time. --Ancheta Wis (talk | contribs) 10:35, 15 December 2018 (UTC)
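Purely as an illustration of the surname-first sorting idea (Python; the names and aliases are hypothetical examples, not a proposal for the actual list):

    # Sorting display names by a "Surname, Given" alias key, as suggested above.
    entries = [
        ("Bubba 73", "73, Bubba"),
        ("Ada Lovelace", "Lovelace, Ada"),
        ("Emil Post", "Post, Emil"),
        ("Alan Turing", "Turing, Alan"),
    ]
    for name, alias in sorted(entries, key=lambda e: e[1].lower()):
        print(name)   # prints in alias (surname) order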
Hardwired?
[edit]The article has "...but the 'program' was hard wired right into the set up, usually in a patch panel". Is it correct to call a patch panel (plug board) hard-wired, since it is easily changed? See this dictionary. Bubba73 You talkin' to me? 03:11, 24 December 2018 (UTC)
- It's a relatively more difficult way to 'program'. ENIAC was set up to solve equations by directing the results from one bank of operations to the next. --Ancheta Wis (talk | contribs) 04:07, 24 December 2018 (UTC)
- It is a lot more difficult, but it is really not "hard wired". Bubba73 You talkin' to me? 04:09, 24 December 2018 (UTC)
- Certainly it was an evolution; the computations on punched cards also were directed by moving cables on plug boards, directing data from one unit to the next in this way. It's in the article. If you wish to rephrase this, feel free. But electronic computation did not spring forth fully formed. The process is still evolving, with Optical components as the next phase.
- In the same sense, FPGAs are another waypoint on the spectrum of 'wiring', relatively less 'hard-coded' than other circuits and more 'hard-wired' than applications software.
- Feel free to improve the text. --Ancheta Wis (talk | contribs) 04:33, 24 December 2018 (UTC)
- It is a lot more difficult, but it is really not "hard wired". Bubba73 You talkin' to me? 04:09, 24 December 2018 (UTC)
- One plug may be easily changed, but programming took weeks (ENIAC#Programming, at least till 1948 ENIAC#Improvements).
- Plug boards required physical connections, so they may be considered as "hard wired" (in "directly connected", "connected by cables" and "controlled by hardware" sense: American, [13], [14]). --MarMi wiki (talk) 23:50, 29 December 2018 (UTC)
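A thought-experiment sketch of the distinction being discussed (Python; this does not model ENIAC or any specific plug board): in the "hard-wired" style the sequence of operations is fixed by the physical set-up before the run, while in the stored-program style the sequence is just data that can be swapped without re-wiring.

    # 'Plug-board' style: the operation sequence is baked into the set-up;
    # changing the program means physically re-wiring the units.
    def plugboard_job(x):
        return (x + 5) * 3                  # adder output wired into the multiplier

    # Stored-program style: the program is data in memory, interpreted at run time.
    def run_stored_program(program, x):
        for op, operand in program:
            if op == "add":
                x += operand
            elif op == "mul":
                x *= operand
        return x

    program = [("add", 5), ("mul", 3)]      # can be edited without touching any wiring
    assert plugboard_job(4) == run_stored_program(program, 4) == 27

Both compute the same result; the difference is only in where the "program" lives, which is the nub of the "hard-wired" terminology question above.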
Post-1960 (integrated circuit based) - move to 1960s–present?
[edit]I think that most of Post-1960 (integrated circuit based) section should be moved to History of computing hardware (1960s–present). --MarMi wiki (talk) 22:08, 30 December 2018 (UTC)
- I would go further and combine this article's sections 9 and 10 into one new section "History of computing hardware (1960s–present)" with just a summary and a main link to the History of computing hardware (1960s–present) article. Anything in these two sections not in the main article should be moved and the redundant rest this article in deleted. Tom94022 (talk) 22:40, 30 December 2018 (UTC)
American knowledge of Colossus
[edit]It is clear that Americans were intimately involved in the use of Colossus during WWII - see: "Small, Albert W. (December 1944), The Special Fish Report, The American National Archive (NARA) College Campus Washington". So I shall revert the recent edit. --TedColes (talk) 04:15, 11 January 2019 (UTC)
- Probably a good idea to wedge that ref into the article somewhere. - Snori (talk) 05:27, 11 January 2019 (UTC)
Integrated Circuits
[edit]I would like to discuss this edit:[15]
@Tom94022: <--ping
It seems to me that computers based on integrated circuits were an important intermediate step between computers based upon discrete transistors and computers based upon microprocessors. I think the section should be restored. --Guy Macon (talk) 01:14, 20 January 2019 (UTC)
- I agree that a section on IC-based computers is important for computing history. For those on low budgets, minicomputers (using ICs) were still supreme when people started mucking around with microprocessors. The old "Integrated circuit" section is at permalink. Johnuniq (talk) 01:34, 20 January 2019 (UTC)
- I WP:BRD restored it. --Guy Macon (talk) 06:12, 20 January 2019 (UTC)
- Computers with integrated circuits are basically the whole third generation, in my opinion. Bubba73 You talkin' to me? 06:16, 20 January 2019 (UTC)
- Did you know that they are still available? See [ https://gigatron.io/ ]. --Guy Macon (talk) 06:40, 20 January 2019 (UTC)
- Yes, I've seen a YouTube video about it by the 8-bit guy. Bubba73 You talkin' to me? 07:06, 20 January 2019 (UTC)
- Did you know that they are still available? See [ https://gigatron.io/ ]. --Guy Macon (talk) 06:40, 20 January 2019 (UTC)
Integrated circuit computers never left the article; some of the history of integrated circuits did. The article should be aligned with the traditional four generations of electronic computers: tube, transistor, IC (not microprocessor) and (monolithic) microprocessor. @Guy Macon:'s edit sort of messed this up, lumping three into one, which I will restore. As for the history of the invention of the IC, does it really have much to do with this article? It is very well covered in the integrated circuit article. I will leave the history in until we hear from other editors. Tom94022 (talk) 07:04, 20 January 2019 (UTC)
- There's no question that the IC era was important, but the story is more granular than 1, 2, 3, 4 generations. Computer architectures can be independent of the types of logic elements used to implement them. Thus the PDP-8 started with discrete transistors, went through stages of IC designs and ended as a microprocessor. Or going back further, there's the IBM 709 - 7090 transition from vacuum tube to transistor. (Ada Lovelace described the concept clearly in her writing about the Analytical Engine.) A major milestone was the development of standardized IC families, especially TTL. Early 16-bit minicomputers, like the PDP-11 or the DG Nova, used the basic 7400-series integrated circuits, with only a few logic elements per chip, what became known as SSI, small-scale integration, IIRC. As the technology improved, single chips did more complex operations, such as encoders and decoders or multi-bit bus interfaces, medium-scale integration. I remember the 74LS245 octal bus transceiver introduction. Previously one needed separate bus driver and bus receiver chips. The '245 cut the chip count for a bus interface in half and was an instant success, so much so that supplies were tight, prices rose, and a black market developed, with thieves stealing supplies of chips from warehouses. The next stage was so-called large-scale integration, LSI, with, for example, bit-slice chips like the 1975 AMD Am2900 series that implemented all the circuitry for a CPU four bits at a time. Everyone knew that eventually an entire CPU would fit on a chip, but early designs were very limited in what they could do, so machines based on the LSI designs persisted for several years as microprocessors improved.--agr (talk) 12:40, 20 January 2019 (UTC)
- Perhaps we need to split the history into two sections? One could cover the classes of computer -- programmable looms, mechanical adding machines, analog computers, programmable calculators and 10-key adding machines, mainframes, supercomputers, minicomputers, PLCs, personal computers, smart phones, etc. The other could cover technologies -- mechanical (plus hydraulic?), relay, vacuum tube, transistor, LSI, VLSI, Microcontroller, PLD, SoC, etc. --Guy Macon (talk) 17:06, 20 January 2019 (UTC)
- There is a RS for generations and within generations, classes: see Bell’s Law for the Birth and Death of Computer Classes: A theory of the Computer’s Evolution. So adding the generations is just a start to improving this article and they should not have been reverted out. Tom94022 (talk) 21:17, 20 January 2019 (UTC)
The section History_of_computing_hardware#Integrated_circuits really doesn't say anything about how/when ICs got into computers. The Apollo Guidance Computer should be mentioned as one of the first; it used only one type of small-scale IC (double 3-input NOR). Probably some of you know other early computers based on ICs (is that in what was removed?). Dicklyon (talk) 20:11, 20 January 2019 (UTC)
- What was removed was some, not all, of the detailed history of ICs, as not particularly relevant to this article, and a little bit of the history of third-generation computers was added but then reverted out. Agreed the section History_of_computing_hardware#Integrated_circuits needs improvement and I'll take a crack at it there. There is a Draft:List_of_integrated_circuit_computers article that has some details that should be moved into History_of_computing_hardware#Integrated_circuits. Tom94022 (talk) 21:17, 20 January 2019 (UTC)
Per WP:TALKDONTREVERT and WP:BRD I reverted the following rather major removal of sourced material, only to face an editor who chooses to re-revert rather than discuss.[16][17][18] This is the same editor who tried to delete a large chunk of material that we discussed in the section above.[19][20]
Before I go any further, I would like to bring this up for discussion. Should that material be deleted or retained? --Guy Macon (talk) 22:22, 20 January 2019 (UTC)
- What major removal of material are you referring to? A few sentences in a long article by my count. And it was discussed above. Let's be clear that this is not just BRD cycles, since there were substantive changes made in most of the edits and so far Macon has provided no justification for his reversions other than that some deleted material is referenced. I agree there is a legitimate question as to how much of the deleted material should be added back to the article. Here is what was deleted, shown in strikethrough:
The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952:[141]
With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires.[142] The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers”. The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[143] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[144] In his patent application of 6 February 1959, Kilby described his new device as “a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated.”[145] The first customer for the invention was the US Air Force.[146] Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[147] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
- My stated opinion is that this IC history is not particularly relevant to this computer history article (i.e. undue in this context) and is redundant in that it is well covered in the IC article. Macon's only discussion is that these sentences are referenced. If other editors think this deleted material should be added back to the one section, that's fine.
- Furthermore, my second edit[21] was not a reversion at all but a substantial revision including new material, new organization and the above deletion. Macon twice reverted it all without discussion, which is a major violation of WP:BRD. If other editors want to edit that material, fine, but I suggest it is more appropriate for Macon to discuss rather than edit or revert. Tom94022 (talk) 00:27, 21 January 2019 (UTC)
- The basis of this article was User:Michael Hardy's statement that the history of computing is bigger than the history of the hardware. I went back to the 2008 edit history when the microprocessor material was added to this article, such as the Intel 8742. (My motivation was the illustration of the interconnects to the CPU.) At that time, the Post-1960 article had nothing comparable. Indeed, at that time I had to add in material to the integrated circuit article. As the highest-level article on the history of computing, this article did not have to worry about the Post-IC era. The split is artificial. That is why it can include material about the latest advances in computing. --Ancheta Wis (talk | contribs) 03:06, 21 January 2019 (UTC)
- I see that some of my citations were moved to the post-1960 article and removed from this one. --Ancheta Wis (talk | contribs) 10:17, 21 January 2019 (UTC)
- It's important that we remember that microprocessor designs are not the be-all and end-all. For example, the electromagnetic pulses in a nuclear war would knock out those designs, and we would be back to vacuum tubes and other radiation-hardened circuit designs for computing. --Ancheta Wis (talk | contribs) 11:06, 21 January 2019 (UTC)
- A severe geomagnetic storm could also knock them out, something for which there is little planning being done.--TedColes (talk) 14:39, 21 January 2019 (UTC)
It's not clear why we have two articles that overlap, History_of_computing_hardware and History_of_computing_hardware_(1960s–present), but given that's the situation it seems appropriate to me that most of the information about the third and fourth generations of computer hardware belongs in the second article, while this article has summary sections pointing to the sections in the second article as the main article. Along that line, I intend to move much of the microprocessor material to the main article. Comments?
I do think the "generations" belong in both articles and have so edited them into both articles. For this reason I reverted Ancheta Wis change of the section title. I have no objection to a new section on Current Commputers but shouldn't be in History_of_computing_hardware_(1960s–present)? It should include Microprocessor Computers but likely includes other computer devices. Would need to find a reference to add it in either article but I am sure there are some. Tom94022 (talk) 17:45, 21 January 2019 (UTC)
- Indeed, it's all confusing, and I don't totally follow in this discussion who is pushing in which direction. But I think I agree with you that we don't need to discuss the many inventors of the integrated circuit in these computer articles. We just need to discuss how IC technology was adopted into computer hardware. Dicklyon (talk) 18:03, 21 January 2019 (UTC)
Reference 118
[edit]Reference 118 is a link to a patent application. It is used at the end of a sentence saying that magnetic core was dominant until the mid-1970s. This is not in the source. Bubba73 You talkin' to me? 03:00, 22 January 2019 (UTC)
- Computer History Museum says magnetic core lasted until 1980, supplanted by DRAM devices: 4K DRAM devices 1973, 16K DRAM 1976, 64K DRAM 1979[1] [2] --Ancheta Wis (talk | contribs) 04:39, 22 January 2019 (UTC)
References
- My point is that reference 118 doesn't support the sentence to which it is attached. Bubba73 You talkin' to me? 05:13, 22 January 2019 (UTC)
- We can augment the reference, can't we. --Ancheta Wis (talk | contribs) 05:16, 22 January 2019 (UTC)
- My point is that reference 118 doesn't support the sentence to which it is attached. Bubba73 You talkin' to me? 05:13, 22 January 2019 (UTC)
- Yes, please do. Bubba73 You talkin' to me? 05:47, 22 January 2019 (UTC)
- Fixed. Tom94022 (talk) 07:02, 22 January 2019 (UTC)
Moving beyond microprocessors
[edit]The scope of computing has moved beyond microprocessors; multiple governments beyond the US government are seeking quantum computing as a matter of national security, including Canada, Australia, the Netherlands, the United Kingdom, the European Union, Singapore, Russia, North Korea and Japan.[1] What this means for the article is a change to the post 1960 section name, beyond the microprocessor.[1] --Ancheta Wis (talk | contribs) 07:02, 23 February 2019 (UTC)
- @Ancheta Wis: It's not clear why this history is bifurcated at 1960, but given that quantum computing is still in its infancy it probably doesn't deserve creating another break in the history of computing hardware. I suggest that at most it currently deserves a section in History_of_computing_hardware_(1960s–present). Why don't you do it? Tom94022 (talk) 18:57, 23 February 2019 (UTC)
- That article describes devices that got stuck in the von Neumann bottleneck after 50 years (2010). Those kinds of computers have ceased to get faster. --Ancheta Wis (talk | contribs) 02:18, 26 February 2021 (UTC)
- In February 2021, photonic techniques have generated "ultrafast generation of hundreds of random bit streams in parallel with a single laser diode". Kyungduk Kim, Stefan Bittner, Yongquan Zeng, Stefano Guazzotti, Ortwin Hess, Qi Jie Wang, Hui Cao. "Massively parallel ultrafast random bit generation with a chip-scale laser" Science (26 Feb 2021) 371, (6532), pp. 948-952 DOI: 10.1126/science.abc2666 . The 250 terabyte-per-second rate is hundreds of times faster than microprocessor-based methods of pseudo-random number generation (repeatable data), while laser-based random (non-repeatable data) number generation uses spontaneous emission, a physical process for random number generation.
- The device can already generate blocks of random (nonrepeatable) data that exceed the size of the largest library in the world (the Library of Congress) in 12 seconds.[1]
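To illustrate only the repeatable-versus-non-repeatable distinction drawn above (Python; this sketch does not model the laser device itself): a seeded pseudo-random generator replays the same bits from the same seed, while entropy drawn from physical noise has no seed to replay.

    import os, random

    # Pseudo-random (repeatable): the same seed always reproduces the same bit stream.
    rng_a = random.Random(42)
    rng_b = random.Random(42)
    stream = [rng_a.getrandbits(8) for _ in range(4)]
    assert stream == [rng_b.getrandbits(8) for _ in range(4)]

    # Physical-entropy randomness (non-repeatable): the OS mixes in hardware noise,
    # so there is no seed that can replay the output.
    print(stream, os.urandom(4).hex())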
References
- It should be noted that the laser (Light Amplification by Stimulated Emission of Radiation) was first built in 1960 (sixty years ago, when the integrated circuit was itself demonstrated). This shift to photonic techniques that already eclipse electronic devices can only increase in the future. The technique has immediate practical and financial applications (e.g. blockchain). --Ancheta Wis (talk | contribs) 02:08, 26 February 2021 (UTC)
"Moderne"
[edit]Per Wikipedia:Disambiguation: There are three important aspects to disambiguation: "Making the links for ambiguous terms point to the correct article title. For example, an editor of an astronomy article may have created a link to Mercury, and this should be corrected to point to Mercury (planet)."
The term Modern is vague and meaningless. Modern history covers the period from the 16th to the 21st century. Modernity is also used for the "socio-cultural norms" of the world prior to World War II, and is associated with Modernism as an art movement (late 19th century to early 20th century).
Meanwhile contemporary history refers to the present time period. Dimadick (talk) 19:24, 5 March 2019 (UTC)
- IMO the ordinary dictionary meaning of the adjective "modern" is sufficient and unambiguous in this context so no link is needed to either Modern history or Contemporary history nor is any required under the quoted aspect of Wikipedia:Disambiguation above nor under any other aspect of the policy. From a stylistic perspective "modern history" reads better to my eyes than "contemporary history" as support I would note a Google search has the terms in a ratio of about 16M/4M. As a style issue then MOS:VAR would seem to require the existing style be retained. Tom94022 (talk) 20:17, 5 March 2019 (UTC)
- We are not a dictionary, and this is a history article. And the Wikipedia:Manual of Style specifically mentions cases where "the existing style is problematic". Dimadick (talk) 15:24, 7 March 2019 (UTC)
Does this article need MOSFET history?
[edit]I think not! A recent edit reverted, without relevant discussion, the removal of material that is not particularly relevant to this article and is well covered in the linked article and elsewhere. I'm going to revert it again and hopefully there will be some discussion here and not edit warring. Tom94022 (talk) 01:24, 13 September 2019 (UTC)
Wikipedia:Copying within Wikipedia
[edit]- What needs to be acknowledged is sourcing. The work had a source, and a volunteer who did the work. The licensing depends on this. At the very least, the Wikipedia article that served as the impetus ought to be noted, in order for the work to be freely available. User:Michael Hardy formed this article and a family of others with his frank admission of what he realized he did not know. Wikipedia:Copying within Wikipedia was formulated in 2009 and there has been copying of this History of computing hardware. This article is very old (in Wikipedia terms), but those of us who wrote it still know its story, and we gladly shared our knowledge. Some of us are still around. The more recent articles need to acknowledge their sources. (See the links) --Ancheta Wis (talk | contribs) 06:57, 13 September 2019 (UTC)
- Can you please spell out the problem. Are you saying that someone moved text from this article to another article without the attribution required by WP:CWW? What other article? Johnuniq (talk) 07:06, 13 September 2019 (UTC)
- It's a synchronization problem. We were writing articles all these years, and a culture arose, but some points got omitted during this time, without specifically acknowledging WP sources. So the problem arose gradually. I'm pretty sure it happened in good faith. --Ancheta Wis (talk | contribs) 07:22, 13 September 2019 (UTC)
- Lots of text is added and later removed. That is not a problem. The question now is what text should be in this article, and what text should be in a related article. Johnuniq (talk) 07:36, 13 September 2019 (UTC)
- Per Wikipedia:Copying within Wikipedia, "state that content was copied from that source" in the summary, not just the article. A simple acknowledgement of "Who got what, and when." --Ancheta Wis (talk | contribs) 07:47, 13 September 2019 (UTC)
- I linked to WP:CWW above, and it's why I asked for the problem to be spelled out. Johnuniq (talk) 07:50, 13 September 2019 (UTC)
- The links resolve to the same article. The problem is credit for licensing. --Ancheta Wis (talk | contribs) 07:57, 13 September 2019 (UTC)
- I linked to WP:CWW above, and it's why I asked for the problem to be spelled out. Johnuniq (talk) 07:50, 13 September 2019 (UTC)
- Per Wikipedia:Copying within Wikipedia, "state that content was copied from that source" in the summary, not just the article. A simple acknowledgement of "Who got what, and when." --Ancheta Wis (talk | contribs) 07:47, 13 September 2019 (UTC)
- Lots of text is added and later removed. That is not a problem. The question now is what text should be in this article, and what text should be in a related article. Johnuniq (talk) 07:36, 13 September 2019 (UTC)
- It's a synchronization problem. We were writing articles all these years, and a culture arose, but some points got omitted during this time, without specifically acknowledging WP sources. So the problem arose gradually. I'm pretty sure it happened in good faith. --Ancheta Wis (talk | contribs) 07:22, 13 September 2019 (UTC)
- Can you please spell out the problem. Are you saying that someone moved text from this article to another article without the attribution required by WP:CWW? What other article? Johnuniq (talk) 07:06, 13 September 2019 (UTC)
The above subsection is not particularly relevant to the question raised. It doesn't matter whether irrelevant material is copied from within Wikipedia or obtained from reliable sources; it is still not relevant to this article. Similarly, it is not particularly relevant that the disputed material was in the article at some distant time. Nor does the age of the article or prior authors particularly matter. The history of MOSFETs just does not deserve a place in this article. I suppose if a reliable source can be found perhaps a single sentence might be added; something along the lines of, "Modern microprocessors are built upon MOSFET technology." Tom94022 (talk) 20:10, 14 September 2019 (UTC)
Earliest?
[edit]See Talk:Analog computer#Earliest? --Guy Macon (talk) 16:23, 6 October 2019 (UTC)
- Just as the Antikythera mechanism was used to calculate the date of the sacred Olympic games, the south-pointing chariot was for the emperor of China, whose only duty was to face south; the Mandate of Heaven to justly govern then determined which army would win in battle. This was the state of military technology (think a hail of bronze-tipped bolts fired from crossbows) during the Warring States period. (See Science and Civilisation in China; I read this in the 1970s, so it would have been in one of the Volumes 1 to 4 -- there are now over 27 such books.) --Ancheta Wis (talk | contribs) 18:20, 6 October 2019 (UTC)
Entire Class of Electro/Mechanical Computers MISSING
[edit]Entire Class of Electro/Mechanical Computers MISSING
There were thousands of types of mechanical & electro-mechanical load/store computers used for hundreds of years.
Most of these used a sled, cart, or feeder robot that would take a sequence control (such as an index number, turns, box or document number, page or slot, etc) and go fetch or access something.
These, by storing values, performed almost any sequence of computations from any library of punched tape, reels, feed tape, mechanical route cards, etc...
There were huge tabulation and computational facilities supporting these little bots that followed routes on rails or channels leading anywhere. — Preceding unsigned comment added by 172.58.187.157 (talk) 19:07, 18 December 2019 (UTC)
- You probably had tabulating machines in mind, but they weren't computers. MarMi wiki (talk) 23:37, 18 December 2019 (UTC)
- The "sled, cart, or feeder robot that would take a sequence control (such as an index number, turns, box or document number, page or slot, etc) and go fetch or access something" did not exist "for hundreds of years". The closest we ever had to that was the pneumatic tube transport system, but those were fixed path, not programmable. --Guy Macon (talk) 02:36, 19 December 2019 (UTC)
- Or jukeboxes? Dicklyon (talk) 03:52, 19 December 2019 (UTC)
- Good catch! I hadn't thought of jukeboxes, but they certainly "fetched" something. Not a computer, of course, but modern jukeboxes have replaced the pure mechanical system with one controlled by a computer.
- I find it fascinating that Rock-ola[22] is still making and selling jukeboxes 30 years after the introduction of the MP3 player made them obsolete. You just wait; slide rules (also not computers) are sure to make a comeback Real Soon Now. (Get off my lawn, you damn kids!) --Guy Macon (talk) 15:46, 19 December 2019 (UTC)
- Or jukeboxes? Dicklyon (talk) 03:52, 19 December 2019 (UTC)
- The "sled, cart, or feeder robot that would take a sequence control (such as an index number, turns, box or document number, page or slot, etc) and go fetch or access something" did not exist "for hundreds of years". The closest we ever had to that was the pneumatic tube transport system, but those were fixed path, not programmable. --Guy Macon (talk) 02:36, 19 December 2019 (UTC)
RECOMP - early computer?
[edit]The 1962 book Computers: the machines we think with, by D. S. Halacy, Jr, pg. 49, says that the ENIAC was followed by BINAC, MANIAC, JOHNNIAC, UNIVAC, RECOMP, STRETCH, and LARC. I couldn't find anything about RECOMP, but there is Autonetics Recomp II. Was there a computer named RECOMP, before the RECOMP II? Bubba73 You talkin' to me? 21:02, 19 August 2020 (UTC)
- " Vintage Computer 1958 Transistor Logic - Autonetics Recomp
- "Recently I was working in our computer museum warehouse and brought out a computer made in 1958 with transistor logic. It is an Autonetics Recomp 501 digital computer. This looks like an attempt by Autonetics to make a commercial computer from the cards and designs used in the Minuteman 1 ICBM guidance computer. Not many of these computers were made and this one is serial number 003. I located one additional Recomp computer in a California museum. These computers had a very limited success commercially however it sure is an interesting piece of hardware - they called it a portable office computer. You will enjoy the photos. Update 10-26-14 I think the Recomp 501 was first then the Minuteman 1 computer."[23]
- https://2.bp.blogspot.com/-2OQ-ah4mR8w/VC-C3c8-fhI/AAAAAAAAFKw/jkTU17oPASE/s1600/DSCF6732.JPG
- https://4.bp.blogspot.com/-qtV9MOZYM7s/VQ3kUOOwNtI/AAAAAAAAF7o/M0g9qqp8aOY/s1600/Autonetics_Recomp%2Bopen%2Bwings%2Bout%2B7-25-2013_14.JPG
- https://1.bp.blogspot.com/-FAxpCqwNp80/VC4KJRIPlxI/AAAAAAAAFJc/twRgGkDv7J4/s1600/500004715-05-01.jpg
- Also listed at http://ucan.us/doyetech/word.htm --Guy Macon (talk) 04:32, 20 August 2020 (UTC)
- Thanks for the photos - it looks like a mini Bendix G-15. (I got a plug board from a G-15 a week or two ago!) But that description matches our article on the Recomp II. Was there an earlier Recomp? Or is the II in the article wrong? And the advertisement for the 1958 computer doesn't have the "II". Bubba73 You talkin' to me? 04:56, 20 August 2020 (UTC)
- Well, the 1961 BRL report has a Recomp I, page 0819. Bubba73 You talkin' to me? 05:00, 20 August 2020 (UTC)
- Photo linked to on that page: http://www.ed-thelen.org/comp-hist/BRL61-0820.jpg --Guy Macon (talk) 08:31, 20 August 2020 (UTC)
Did some more searching: April 1, 1957 "Recomp I, a new portable, high-speed, completely transistorized digital computer" https://www.americanradiohistory.com/hd2/IDX-Site-Technical/Engineering-General/Archive-Electronics-IDX/IDX/50s/57/Electronics-1957-04-OCR-Page-0138.pdf
Theory #1: The Recomp I was introduced in 1957, the Recomp II was introduced in 1958, and they were still selling Recomp Is in 1958.
Theory #2: They called the Recomp I "Recomp" until they decided to build a Recomp II, and at that point started calling the first Recomp a Recomp I.
--Guy Macon (talk) 06:12, 20 August 2020 (UTC)
First programmable analog computer?
[edit]Why do I keep seeing this same exact line "The castle clock, a hydropowered mechanical astronomical clock invented by Al-Jazari in 1206, was the first programmable analog computer.[10][11][12]" posted on all the major computing history Wikipedia articles? For starters, I believe the claim is a bit sensationalized, as the word "programmable" is being used very loosely here. The term usually means being able to provide instructions to a machine so that the machine can adjust its operations accordingly. In the case of Al-Jazari's clock (also only loosely associated with a computer, though it performs a computation of sorts, namely keeping time, so I will grant that), the clock had to be manually recalibrated. Does this qualify as programmable? In addition, the actual cited source isn't even correct. The episode in question of Ancient Discoveries on the History Channel is Series 3 Episode 9, and the episode itself (available on YouTube) doesn't even support the claim that Al-Jazari's clock was the first programmable analog computer. The episode actually makes an even stranger claim: that Al-Jazari's clock was a "super computer". I also looked through source 11 and didn't find the claim supported on the page given. What is going on here? 2601:82:200:8B20:0:0:0:3C04 (talk) 01:24, 16 June 2022 (UTC)
- It seems clear that the castle clock is programmable in adjusting the flow for unequal hours of a day. Whether it is the earliest remains to be shown. Tom94022 (talk) 02:58, 16 June 2022 (UTC)
Sequencing of Turing and Von Neumann
[edit]@TedColes: My edit [24], which you reverted, corrected the suggestion in the article that Turing's design was independent of the work done by Mauchly and Eckert at the University of Pennsylvania, as reported by John von Neumann in his "First Draft of a Report on the EDVAC." My correction gave references, which you removed.
The version prior to my edits and the current version after being reverted says:
In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. His 1945 report 'Proposed Electronic Calculator' was the first specification for such a device.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture".
However, Turing himself states on page 3 of his 'Proposed Electronic Calculator': [1]
"The present report gives a fairly complete account of the proposed calculator. It is recommended however that it be read in conjunction with J. von Neumann's 'Report on the EDVAC',"
Turing indeed wrote a more fully worked-out design, but he does not claim to have written the "first specification for such a device." My edits, which correct the timing without denigrating Turing's contribution in any way, should be restored.--agr (talk) 19:57, 27 August 2023 (UTC)
- I agree with agr that sequencing JvN before Turing is supported by the historical record, and therefore his edit should be restored. Tom94022 (talk) 23:18, 27 August 2023 (UTC)
- It can be difficult to present developments that happened in parallel in a linear medium such as a Wikipedia article. Turing’s 1936 paper is, however, widely accepted as being seminal, and there was a great deal of transatlantic exchange and sharing of ideas. As Copeland puts it in his 2004 book The Essential Turing
The idea of a universal stored-programme computing machine was promulgated in the USA by von Neumann and in the UK by [Max] Newman, the two mathematicians who, along with Turing himself, were by and large responsible for placing Turing’s abstract universal machine into the hands of electronic engineers.[2]
- I reverted the edit by ArnoldReinhold (talk · contribs) because it left nothing immediately under the heading "Theory". I think that it is important to recognise that parallel developments took place on the two sides of the Atlantic. It should be possible to arrive at a consensus on an acceptable re-sequencing of the material.--TedColes (talk)
- It's important to recognize that WWII secrecy played a role in the development of the ideas: Gordon Bell and Allen Newell's 1971 Computer Structures page xiii acknowledges 64 names (including M.V. Wilkes but skipping over Turing). When even Bell and Newell fail to cite, it's not through lack of diligence, but likely by government directive at the organizational level (which got superseded when economics and electronics finally showed the importance of computing hardware in everyday life). The evidence is that information flowed asymmetrically across the Atlantic. --Ancheta Wis (talk | contribs) 10:46, 28 August 2023 (UTC)
- Turing's 1936 paper is already covered in the first sentence of the section. The rest of the first paragraph seems out of context, since it jumps ahead to his 1945 work and places it before JvN. The simple solution seems to be to move the rest of the first paragraph, something like this:
The theoretical basis for the stored-program computer had been proposed by Alan Turing in his 1936 paper.
Meanwhile, John von Neumann at the Moore School of Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945. Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture". Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in 1946, giving the first reasonably complete design of a stored-program computer, a device he called the Automatic Computing Engine (ACE). However, the better-known EDVAC design of John von Neumann, who knew of Turing's theoretical work, received more publicity, despite its incomplete nature and questionable lack of attribution of the sources of some of the ideas.[54] In 1945 Turing joined the National Physical Laboratory and began his work on developing an electronic stored-program digital computer. Turing thought that the speed an ...
- Tom94022 (talk) 23:38, 28 August 2023 (UTC)
- I appreciate the way the topic is presented in light of @TedColes' latest edit. Turing's undeniable role in laying the theoretical foundation of the stored-program computer with his Turing machine is evident and well established in the article. Meanwhile, the Moore School, von Neumann, Wilkes and others worked assiduously to translate theory into practical reality. The first stored-program computers emerged as products of brilliant teamwork, where all actors, including Turing, acknowledged, cited and respected each other's contributions. I think it's important to recognize that no major figure should be overshadowed by another, as they all played significant roles in shaping the course of stored-program development. Damien.b (talk) 11:28, 29 August 2023 (UTC)
- I have re-written the section and hope that I have represented as many as possible of the different ideas presented here.--TedColes (talk) 12:12, 29 August 2023 (UTC)
- Section references
- ^ Alan Turing (1945). Proposed Electronic Calculator (PDF). Retrieved August 24, 2023.
- ^ Copeland, B. Jack (2004). The Essential Turing. Oxford University Press. p. 16. ISBN 0-19-825080-0.
Commercial Computer
[edit]Why does the article claim that 'The first commercial computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951.' when the Z4 was already rented to ETH and in operation there in 1950? This in my view clearly makes the Z4 the first commercial computer. The Z4 article even says (with references) that 'In 1950/1951, the Z4 was the only working digital computer in Central Europe, and the second digital computer in the world to be sold or loaned,[1]: 981 beating the Ferranti Mark 1 by five months and the UNIVAC I by ten months, but in turn being beaten by the BINAC (although that never worked at the customer's site[19]).' Calling a computer that never really worked at the customer's site the 'first commercial computer' seems rather misleading, so the first computer working for money is clearly the Zuse Z4. --85.169.148.50 (talk) 22:36, 10 March 2024 (UTC)
New section
[edit]@Nathansanni: The new subsection Impact of the Industrial Revolution on Computing Hardware added in section 1, "Early devices", seems both out of place and redundant to material already in section 2, First proposed general-purpose computing device. Please consider moving it. Tom94022 (talk) 18:30, 30 October 2024 (UTC)
- Hey, just merged both of these articles so they would not seem redundant. Nathansanni (talk) 19:07, 30 October 2024 (UTC)