Don's Pages and my Music

Tuesday, February 22, 2011

Watson (artificial intelligence software) - Wikipedia, the free encyclopedia

Watson (artificial intelligence software)

From Wikipedia, the free encyclopedia

IBM and the Jeopardy Challenge


Link to Video...
http://www.youtube.com/watch?v=FC3IryWr4c8

"IBM Watson" redirects here. For the laboratory, see: Thomas J. Watson Research Center.
Watson's avatar was inspired by the IBM "smarter planet" logo.[1]

Watson is an artificial intelligence computer system capable of answering questions posed in natural language,[2] developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named for IBM's first president, Thomas J. Watson.[3][4] The program operates on POWER7 processor-based systems.

In 2011, as a test of its abilities, Watson competed on the quiz show Jeopardy!, in the show's first and only human-versus-machine matchup.[3] In a two-game, combined-point match, broadcast in three Jeopardy! episodes February 14–16, Watson bested Brad Rutter, the biggest all-time money winner on Jeopardy!, and Ken Jennings, the record holder for the longest championship streak.[5][6] Watson received the first prize of $1 million, while Ken Jennings and Brad Rutter received $300,000 and $200,000, respectively. Jennings and Rutter pledged to donate half their winnings to charity, while IBM divided Watson's winnings between two charities.[7]

In the match, all contestants, including Watson, had to wait until host Alex Trebek spoke each clue entirely, then a light was lit as a signal; the first to activate their buzzer button won the chance to respond. Although Watson had difficulty interpreting the context of some clues, it typically activated its button faster than its opponents. Watson had trouble with only a few categories, such as those whose clues contained only a few words. For each clue, Watson's three most probable responses were displayed on the television screen. Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of hard disk storage,[8] including the full text of Wikipedia.[9] Watson was not connected to the Internet during the game.[10][11]


Overview

Watson is a question answering (QA) computing system built by IBM.[2] IBM describes it as "an application of advanced Natural Language Processing, Information Retrieval, Knowledge Representation and Reasoning, and Machine Learning technologies to the field of open domain question answering" which is "built on IBM's DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring."[2] Regarding Watson's construction, IBM states:

Watson is a workload optimized system designed for complex analytics, made possible by integrating massively parallel POWER7 processors and the IBM DeepQA software to answer Jeopardy! questions in under three seconds. Watson is made up of a cluster of ninety IBM Power 750 servers (plus additional I/O, network and cluster controller nodes in 10 racks) with a total of 2880 POWER7 processor cores and 16 Terabytes of RAM. Each Power 750 server uses a 3.5 GHz POWER7 eight core processor, with four threads per core. The POWER7 processor's massively parallel processing capability is an ideal match for Watson's IBM DeepQA software which is embarrassingly parallel (that is a workload that executes multiple threads in parallel).[12]

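The figures in IBM's description fit together with simple arithmetic. The short Java sketch below, written for this page rather than taken from IBM, reconstructs the totals; the four-sockets-per-server figure is an assumption used to reconcile 90 servers with 2,880 cores, since the quote itself names only the eight-core POWER7 chip and its four threads per core.

    // Back-of-the-envelope check of the hardware figures quoted above.
    // The 4-sockets-per-server value is an assumption (not stated in the
    // quote) used to reconcile 90 servers with 2,880 cores.
    public class WatsonHardwareCheck {
        public static void main(String[] args) {
            int servers = 90;
            int socketsPerServer = 4;   // assumed
            int coresPerSocket = 8;     // "eight core processor"
            int threadsPerCore = 4;     // "four threads per core"
            int totalRamTb = 16;        // "16 Terabytes of RAM"

            int totalCores = servers * socketsPerServer * coresPerSocket;
            int totalThreads = totalCores * threadsPerCore;
            double ramPerServerGb = totalRamTb * 1024.0 / servers;

            System.out.printf("Total cores:      %d (IBM quotes 2880)%n", totalCores);
            System.out.printf("Hardware threads: %d%n", totalThreads);
            System.out.printf("RAM per server:   ~%.0f GB%n", ramPerServerGb);
        }
    }
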
Watson's software was written in both Java and C++ and uses the Apache Hadoop distributed file system, the Apache UIMA (Unstructured Information Management Architecture) framework, IBM's DeepQA software and the SUSE Linux Enterprise Server 11 operating system.[8][13][14] According to John Rennie, Watson can process 500 gigabytes, the equivalent of a million books, per second.[15] IBM's master inventor and senior consultant Tony Pearson estimated that Watson's hardware cost about $3 million.[16]

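As a rough sanity check on Rennie's figure, 500 gigabytes per second spread over a million books per second implies an average of roughly half a megabyte of text per book, which is about the size of a full-length novel stored as plain text. The snippet below, included purely for illustration, does that arithmetic.

    // Implied average book size from the throughput figure cited above.
    public class ThroughputCheck {
        public static void main(String[] args) {
            double gigabytesPerSecond = 500;
            double booksPerSecond = 1_000_000;
            double kbPerBook = gigabytesPerSecond * 1024 * 1024 / booksPerSecond;
            System.out.printf("Implied size per book: ~%.0f KB%n", kbPerBook);
        }
    }
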
While primarily an IBM effort, the development team includes faculty and students from Carnegie Mellon University, University of Massachusetts Amherst, University of Southern California/Information Sciences Institute, University of Texas at Austin, Massachusetts Institute of Technology, University of Trento, and Rensselaer Polytechnic Institute.[17]

Operation

The computer's techniques for unraveling Jeopardy! clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson's case, a 15-terabyte data bank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels "sure" enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing.

—Ken Jennings[18]

When playing Jeopardy!, all players, including Watson, had to wait until the host spoke each clue entirely, then a light was lit as a "ready" signal; the first to activate their buzzer button won the chance to respond.[11][19] Watson received the clues as electronic texts at the same moment they were made visible to the human players.[11] It then parsed the clues into keywords and sentence fragments in order to find statistically related phrases.[11] Watson's main innovation was not the creation of a new algorithm for this operation, but rather its ability to quickly execute thousands of proven language analysis algorithms simultaneously to find the correct answer.[11][20] The more algorithms that find the same answer independently, the more likely Watson is to be correct.[11] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense.[11] In a sequence of 20 mock games, human participants were able to use the six to seven seconds, on average, needed to hear the clue to decide whether to signal to respond.[11] During that time, Watson also has to evaluate the response and determine whether it is sufficiently confident in the result to signal.[11] Part of the system used to win the Jeopardy! contest was the electronic circuitry that received the "ready" signal and then examined whether Watson's confidence level was great enough to activate the buzzer. Given the speed of this circuitry compared to the speed of human reaction times, Watson's reaction time was faster than that of the human contestants, except when the human successfully anticipated (instead of reacted to) the ready signal.[21] After signaling, Watson speaks with an electronic voice and gives its responses in Jeopardy!'s question format.[11]

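The following sketch is a toy illustration of the scheme just described, not IBM's code: several independent analyses each propose a candidate response in parallel, agreement among them is converted into a confidence score, and the system "buzzes" only if that confidence clears a threshold. The three stand-in analyses, the agreement-based confidence measure, and the 0.5 threshold are all invented for illustration.

    import java.util.*;
    import java.util.concurrent.*;
    import java.util.function.Function;

    // Toy model of the approach described above: independent analyses vote on
    // candidate responses, agreement becomes a confidence score, and the system
    // signals only if that confidence clears a threshold. The analyses and the
    // threshold are invented for illustration.
    public class ClueSketch {
        public static void main(String[] args) throws Exception {
            String clue = "Its largest airport was named for a World War II hero";

            // Stand-ins for "thousands of proven language analysis algorithms".
            List<Function<String, String>> analyses = List.of(
                c -> "Chicago",   // e.g. a keyword-association lookup
                c -> "Chicago",   // e.g. a passage-retrieval scorer
                c -> "Toronto"    // e.g. a weaker geographic matcher
            );

            ExecutorService pool = Executors.newFixedThreadPool(analyses.size());
            List<Future<String>> futures = new ArrayList<>();
            for (Function<String, String> analysis : analyses) {
                Callable<String> task = () -> analysis.apply(clue);
                futures.add(pool.submit(task));
            }

            // Count how many independent analyses agree on each candidate.
            Map<String, Integer> votes = new HashMap<>();
            for (Future<String> f : futures) {
                votes.merge(f.get(), 1, Integer::sum);
            }
            pool.shutdown();

            // Confidence = share of analyses agreeing on the top candidate;
            // "buzz" only if it clears an invented 0.5 threshold.
            Map.Entry<String, Integer> best =
                Collections.max(votes.entrySet(), Map.Entry.comparingByValue());
            double confidence = best.getValue() / (double) analyses.size();
            boolean buzz = confidence >= 0.5;

            System.out.printf("Top response: %s (confidence %.2f, buzz=%b)%n",
                best.getKey(), confidence, buzz);
        }
    }
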
The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias, and other reference material that it could use to build its knowledge.[11] Although Watson was not connected to the Internet during the game,[22] it contained 200 million pages of structured and unstructured content consuming four terabytes of disk storage,[8] including the full text of Wikipedia.[9] According to Rennie, the content was stored in Watson's RAM for the game because data stored on hard drives would have been too slow to access.[15]

According to an interview on WFDU-FM on February 14, 2011, the basis for Watson's synthesized voice was provided by actor/audiobook narrator Jeff Woodman, via a 2004 IBM text-to-speech program.[23][verification needed]

Comparison with human players

Because Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses, the system has several strengths and weaknesses compared with a human Jeopardy! player.[24] Watson has deficiencies in understanding the context of clues, and human players usually generate responses faster than Watson, especially on short clues.[11] Unlike a human player, Watson's programming also prevents it from buzzing in before it is sure of its response, a popular tactic among human Jeopardy! players.[11] Once it has generated a response, however, Watson has a consistently faster reaction time on the buzzer, and it is immune to psychological tactics.[11][25]

Development history

One possible genesis of the Watson project has been told anecdotally by those involved in it as beginning in 2004. Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge to throw its strength at. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of this evening hiatus: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the phenomenon. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel up, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[26] In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[26] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed impossible to solve.[11]

In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could only get about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[11] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[11] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[27]

Competing on Jeopardy!

Preparation

The final results of Ken Jennings, Watson and Brad Rutter

In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[11][28] The differences between Watson and the human players generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[24] IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To counter IBM's claim of bias, the Jeopardy! staff had a third party randomly pick clues from previously written shows.[24] Jeopardy! staff also raised concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[26] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all," and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[25][29][30] Stephen Baker, a journalist who recorded Watson's development, reported that the conflict between IBM and Jeopardy! became so serious that the competition was almost canceled in May 2010.[24]

As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy! Human players, including former Jeopardy! contestants, also participated in mock games against Watson with Todd Alan Crain of The Onion playing host.[11] About 100 test matches were conducted with Watson winning 65% of the games.[31]

To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Forty-two colored threads criss-crossed the globe to represent Watson's state of thought; the number 42 was an in-joke referring to the novel The Hitchhiker's Guide to the Galaxy.[18] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there were 36 triggerable states that Watson could use throughout the game to show its confidence in responding to a clue correctly; he had hoped to find 42, to add another level to the Hitchhiker's Guide reference, but he was unable to pinpoint enough game states.[26]

Practice match

In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[32]

First match

The first round was broadcast February 14, 2011. The right to choose first had been determined by a drawing, and went to Rutter.[33] Watson, represented by a computer monitor display and artificial voice, responded correctly to the first clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[34] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[33]

Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings. Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is leg?" after Jennings incorrectly responded "What is a missing hand?" to a clue about George Eyser. (The correct response was, "What is a missing leg?") Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[35] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[36] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[37]

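Tesauro's description gives the inputs to Watson's wagers, a confidence for the category and a regression-based game-state model, but not the model itself. The sketch below is a deliberately simplified stand-in showing how a Daily Double bet could scale with confidence; the linear rule and every constant in it are invented and should not be read as the Game State Evaluator.

    // Illustrative only: a confidence-scaled Daily Double wager. The real
    // system combined category confidence with a regression model ("Game
    // State Evaluator"); this linear rule and its constants are invented.
    public class DailyDoubleWagerSketch {
        static int wager(int currentScore, double categoryConfidence) {
            // A Daily Double wager runs from $5 up to the larger of the
            // player's score and the round's top clue value (simplified here).
            int minBet = 5;
            int maxBet = Math.max(currentScore, 1000);
            // Invented rule: move linearly from the minimum toward the maximum
            // as confidence in the category grows.
            long bet = Math.round(minBet + categoryConfidence * (maxBet - minBet));
            return (int) Math.min(bet, maxBet);
        }

        public static void main(String[] args) {
            System.out.println(wager(10000, 0.32)); // low confidence -> modest bet
            System.out.println(wager(10000, 0.90)); // high confidence -> large bet
        }
    }
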
Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[36]

Although it wagered only $947 on the clue, Watson was the only contestant to miss the Final Jeopardy! response in the category U.S. CITIES ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????"[36][38][39] Ferrucci offered reasons why Watson would have guessed the Canadian city: categories only weakly suggest the type of response desired (for example, the clue may have asked about some fact regarding a U.S. city), the phrase "U.S. city" didn't appear in the question, there are cities named Toronto in the U.S., and an American League baseball team is located in Toronto.[40] Dr. Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite it following a semicolon, and required context to understand that it was referring to a second-largest airport).[41] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge to discard that potential response as not viable.[39]

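Ferrucci's and Nyberg's explanations can be made concrete with a small example: when the category is treated as weak evidence rather than a hard filter, a type mismatch only penalizes a candidate instead of eliminating it, so a strongly supported but wrong candidate can still edge out the correct one, albeit with low confidence. In the sketch below, the tiny "knowledge base", the evidence scores, and the penalty weight are all invented.

    import java.util.List;
    import java.util.Map;

    // Illustration of why a weak type constraint can let "Toronto" survive in
    // a U.S. CITIES category. The knowledge base, evidence scores, and the 0.7
    // penalty are invented for illustration.
    public class AnswerTypeSketch {
        static final Map<String, Boolean> IS_US_CITY = Map.of(
            "Chicago", true,
            "Toronto", false   // the large Canadian city; small U.S. Torontos also exist
        );

        static double score(String candidate, double evidence, boolean hardTypeFilter) {
            boolean matchesType = IS_US_CITY.getOrDefault(candidate, false);
            if (hardTypeFilter) {
                return matchesType ? evidence : 0.0;        // mismatch is discarded
            }
            return matchesType ? evidence : evidence * 0.7; // mismatch is only penalized
        }

        public static void main(String[] args) {
            // Hypothetical evidence scores from the rest of the pipeline.
            Map<String, Double> evidence = Map.of("Chicago", 0.55, "Toronto", 0.80);
            for (String c : List.of("Chicago", "Toronto")) {
                System.out.printf("%-8s soft=%.2f  hard=%.2f%n",
                    c, score(c, evidence.get(c), false), score(c, evidence.get(c), true));
            }
        }
    }
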
The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.[36]

Second match

During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[42]

In the first round, Jennings was finally able to choose a Daily Double clue,[43] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round.[44] At the end of the first round, Watson stood in second place for the first time in the competition, after Rutter and Jennings briefly built up their scores before Watson could respond.[44][45] Nonetheless, Watson won the match with a final score of $77,147, besting Jennings, who scored $24,000, and Rutter, who scored $21,600.[46]

The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM will donate 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[47] Likewise, Jennings and Rutter will donate 50% of their winnings to their respective charities.[48]

In acknowledgment of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", echoing a memetic reference to the episode "Deep Space Homer" on The Simpsons.[49][50] Jennings later wrote an article for Slate, in which he stated "IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of "thinking" machines. "Quiz show contestant" may be the first job made redundant by Watson, but I'm sure it won't be the last."[18]

Future uses

According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[27]

IBM and Nuance Communications Inc. are partnering on a research project to develop, over the next 18 to 24 months, a commercial product that will exploit Watson's capabilities to aid the diagnosis and treatment of patients. Physicians at Columbia University are helping to identify critical issues in the practice of medicine where the Watson technology may be able to contribute, and physicians at the University of Maryland are working to identify the best way that a technology like Watson could interact with medical practitioners to provide the maximum assistance.[51] It has also been suggested by Robert C. Weber, IBM's general counsel, that Watson may be used for legal research.[52]

Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, reflecting the $1 million needed to acquire the complete system that operates Watson. IBM expects the price to decrease substantially within a decade as the technology improves.[11]

Commentator Rick Merritt said that "there's another really important reason why it is strategic for IBM to be seen very broadly by the American public as a company that can tackle tough computer problems. A big slice of Big Blue's pie comes from selling to the U.S. government some of the biggest, most expensive systems in the world."[53]

References

  1. ^ IBM Watson: The Face of Watson at YouTube
  2. ^ a b c DeepQA Project: FAQ, IBM Corporation, http://www.research.ibm.com/deepqa/faq.shtml, retrieved 2011-02-11 
  3. ^ a b Hale, Mike (2011-02-08), Actors and Their Roles for $300, HAL? HAL!, The New York Times, http://www.nytimes.com/2011/02/09/arts/television/09nova.html 
  4. ^ The DeepQA Project, Research.ibm.com, http://www.research.ibm.com/deepqa/deepqa.shtml, retrieved 2011-02-18 
  5. ^ Markoff, John (2009-04-26), "Computer Program to Take On 'Jeopardy!'", The New York Times, http://www.nytimes.com/2009/04/27/technology/27jeopardy.html, retrieved 2009-04-27 
  6. ^ Loftus, Jack (2009-04-26), IBM Prepping Soul-Crushing 'Watson' Computer to Compete on Jeopardy!, Gizmodo, http://gizmodo.com/#!5228887/ibm-prepping-soul+crushing-watson-computer-to-compete-on-jeopardy, retrieved 2009-04-27 
  7. ^ IBM's "Watson" Computing System to Challenge All Time Greatest Jeopardy! Champions, Sony Pictures, 2010-12-14, http://www.jeopardy.com/news/watson1x7ap4.php, retrieved 2010-12-15 
  8. ^ a b c Jackson, Joab (2011-02-17), IBM Watson Vanquishes Human Jeopardy Foes, PC World, http://www.pcworld.com/businesscenter/article/219893/ibm_watson_vanquishes_human_jeopardy_foes.html, retrieved 2011-02-17 
  9. ^ a b Zimmer, Ben (2011-02-17), Is It Time to Welcome Our New Computer Overlords?, The Atlantic, http://www.theatlantic.com/technology/archive/2011/02/is-it-time-to-welcome-our-new-computer-overlords/71388/, retrieved 2011-02-17 
  10. ^ Raz, Guy (2011-01-28), Can a Computer Become a Jeopardy! Champ?, National Public Radio, http://www.npr.org/2011/01/08/132769575/Can-A-Computer-Become-A-Jeopardy-Champ, retrieved 2011-02-18 
  11. ^ a b c d e f g h i j k l m n o p q r s t Thompson, Clive (2010-06-16). "Smarter Than You Think: What Is I.B.M.’s Watson?". The New York Times Magazine. https://www.nytimes.com/2010/06/20/magazine/20Computer-t.html. Retrieved 2011-02-18. 
  12. ^ Is Watson the smartest machine on earth?, Computer Science and Electrical Engineering Department, UMBC, 2011-02-10, http://www.cs.umbc.edu/2011/02/is-watson-the-smartest-machine-on-earth/, retrieved 2011-02-11 
  13. ^ Takahashi, Dean (2011-02-17), IBM researcher explains what Watson gets right and wrong, VentureBeat, http://venturebeat.com/2011/02/17/ibm-researcher-explains-what-watson-gets-right-and-wrong/, retrieved 2011-02-18 
  14. ^ Novell (2011-02-02), Watson Supercomputer to Compete on 'Jeopardy!' -- Powered by SUSE Linux Enterprise Server on IBM POWER7, The Wall Street Journal, http://online.wsj.com/article/PR-CO-20110202-906855.html, retrieved 2011-02-21 
  15. ^ a b Rennie, John (2011-02-14), How IBM’s Watson Computer Excels at Jeopardy!, PLoS blogs, http://blogs.plos.org/retort/2011/02/14/how-ibm%E2%80%99s-watson-computer-will-excel-at-jeopardy/, retrieved 2011-02-19 
  16. ^ Lucas, Mearian (2011-02-21), Can anyone afford an IBM Watson supercomputer? (Yes), Computerworld, http://www.computerworld.com/s/article/9210381/Can_anyone_afford_an_IBM_Watson_supercomputer_Yes_?taxonomyId=67&pageNumber=2, retrieved 2011-02-21 
  17. ^ Ferrucci, D, et al. (2010), "Building Watson: An Overview of the DeepQA Project", AI Magazine 31 (3), http://www.stanford.edu/class/cs124/AIMagzine-DeepQA.pdf 
  18. ^ a b c Jennings, Ken (2011-02-16), My Puny Human Brain, Slate, Newsweek Interactive Co. LLC, http://www.slate.com/id/2284721/, retrieved 2011-02-17, "In the middle of the floor was a huge image of Watson's on-camera avatar, a glowing blue ball crisscrossed by "threads" of thought—42 threads, to be precise, an in-joke for Douglas Adams fans." 
  19. ^ Libresco, Leah Anthony (2011-02-21), A Non-Trivial Advantage for Watson, The Huffington Post, http://www.huffingtonpost.com/leah-anthony-libresco/a-nontrivial-advantage-fo_b_825837.html, retrieved 2011-02-21 
  20. ^ "Will Watson Win On Jeopardy!?", Nova ScienceNOW (Public Broadcasting Service), 2011-01-20, http://www.pbs.org/wgbh/nova/tech/will-watson-win-jeopardy.html, retrieved 2011-01-27 
  21. ^ David, David (2011-01-10), "How Watson “sees,” “hears,” and “speaks” to play Jeopardy!", IBM Research blog (IBM), http://ibmresearchnews.blogspot.com/2010/12/how-watson-sees-hears-and-speaks-to.html, retrieved 2011-02-21 
  22. ^ IBM's 'Watson' to take on Jeopardy! champs, AFP, 2011-02-11, http://www.google.com/hostednews/afp/article/ALeqM5jOUJ_FGtwE3OlCFaOorNa3RuV_cQ?docId=CNG.1aa3e1ece3aedbb76228b6bc8e4c385e.1b1, retrieved 2011-02-19 
  23. ^ "Anything Goes!!® Internationally Syndicated Radio", Anything Goes!!, http://www.anythinggoesradio.com/, retrieved 2011-02-15 
  24. ^ a b c d Needleman, Rafe (2011-02-18), Reporters' Roundtable: Debating the robobrains, CNET, http://chkpt.zdnet.com/chkpt/1pcast.roundtable/http://podcast-files.cnet.com/podcast/cnet_roundtable_021811.mp3, retrieved 2011-02-18 
  25. ^ a b "Jeopardy! Champ Ken Jennings", The Washington Post, 2011-02-15, http://live.washingtonpost.com/jeopardy-ken-jennings.html, retrieved 2011-02-15 
  26. ^ a b c d Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. pp. 6-8. ISBN 0547483163. 
  27. ^ a b Brodkin, Jon (2010-02-10), IBM's Jeopardy-playing machine can now beat human contestants, Network World, http://www.networkworld.com/news/2010/021010-ibm-jeopardy-game.html?hpg1=bn, retrieved 2011-02-19 
  28. ^ Stelter, Brian (2010-12-14), "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars", The New York Times, http://mediadecoder.blogs.nytimes.com/2010/12/14/i-b-m-supercomputer-watson-to-challenge-jeopardy-stars/, retrieved 2010-12-14, "An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011." 
  29. ^ Flatow, Ira (2011-02-11), "IBM Computer Faces Off Against 'Jeopardy' Champs", Talk of the Nation (National Public Radio), http://www.npr.org/2011/02/11/133686004/IBM-Computer-Faces-Off-Against-Jeopardy-Champs, retrieved 2011-02-15 
  30. ^ Alex Strachan (2011-02-12), "For Jennings, it's a man vs. man competition", The Vancouver Sun, http://www.vancouversun.com/entertainment/Jennings+competition/4270840/story.html, retrieved 2011-02-15 
  31. ^ Sostek, Anya (2011-02-13), Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match, Pittsburgh Post Gazette, http://www.post-gazette.com/pg/11044/1125163-96.stm, retrieved 2011-02-19 
  32. ^ Dignan, Larry (2011-01-13), IBM's Watson wins Jeopardy practice round: Can humans hang?, ZDnet, http://www.zdnet.com/blog/btl/ibms-watson-wins-jeopardy-practice-round-can-humans-hang/43601, retrieved 2011-01-13 
  33. ^ a b "The Jeopardy! Challenge". Jeopardy. February 14, 2011. No. 23, season 27.
  34. ^ Lenchner, Jon (2011-02-03), "Knowing what it knows: selected nuances of Watson's strategy", IBM Research blog (IBM), http://ibmresearchnews.blogspot.com/2011/02/knowing-what-it-knows-selected-nuances.html, retrieved 2011-02-16 
  35. ^ Johnston, Casey (2011-02-15), Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek, Ars Technica, http://arstechnica.com/media/news/2011/02/ibms-watson-tied-for-1st-in-jeopardy-almost-sneaks-wrong-answer-by-trebek.ars, retrieved 2011-02-15 
  36. ^ a b c d Computer crushes the competition on 'Jeopardy!', Associated Press, 2011-02-15, http://www.google.com/hostednews/ap/article/ALeqM5jwVBxDQvVKEwk_czuv8Q4jxdU1Sg?docId=2e3e918f552b4599b013b4cc473d96af, retrieved 2011-02-19 
  37. ^ Tesauro, Gerald (2011-02-13), "IBM Research: Watson’s wagering strategies", IBM Research blog (IBM), http://ibmresearchnews.blogspot.com/2011/02/watsons-wagering-strategies.html, retrieved 2011-02-18 
  38. ^ Staff (2011-02-15), IBM's computer wins 'Jeopardy!' but... Toronto?, CTV.ca, http://www.ctv.ca/CTVNews/Entertainment/20110215/watson-jeopardy-final-toronto-110215/, retrieved 2011-02-15 
  39. ^ a b Robertson, Jordan; Borenstein, Seth (2011-02-16), "For Watson, Jeopardy! victory was elementary", The Globe and Mail, http://www.theglobeandmail.com/news/world/americas/for-watson-jeopardy-victory-was-elementary/article1910735/, retrieved 2011-02-17, "A human would have considered Toronto and discarded it because it is a Canadian city, not a U.S. one, but that's not the type of comparative knowledge Watson has, Prof. Nyberg said." 
  40. ^ Hamm, Steve (2011-02-15), Watson on Jeopardy! Day Two: The Confusion over an Airport Clue, A Smarter Planet Blog, http://asmarterplanet.com/blog/2011/02/watson-on-jeopardy-day-two-the-confusion-over-an-airport-clue.html, retrieved 2011-02-21 
  41. ^ Johnston, Casey (2011-02-15), Creators: Watson has no speed advantage as it crushes humans in Jeopardy, Ars Technica, http://arstechnica.com/media/news/2011/02/creators-watson-has-no-speed-advantage-as-it-crushes-humans-in-jeopardy.ars, retrieved 2011-02-21 
  42. ^ Oberman, Mira (2011-02-17), Computer creams human Jeopardy! champions, Vancouver Sun, http://www.vancouversun.com/business/technology/Computer+creams+human+Jeopardy+champions/4300293/story.html, retrieved 2011-02-17, "But a Final Jeopardy flub prompted one IBM engineer to wear a Toronto Blue Jays jacket to the second day of taping and Trebek to joke that he'd learned Toronto was a U.S. city." 
  43. ^ Johnston, Casey (2011-02-17), Bug lets humans grab Daily Double as Watson triumphs on Jeopardy, Ars Technica, http://arstechnica.com/media/news/2011/02/bug-lets-humans-grab-daily-double-as-watson-triumphs-on-jeopardy.ars, retrieved 2011-02-21 
  44. ^ a b Upbin, Bruce (2011-02-17), IBM’s Supercomputer Watson Wins It All With $367 Bet, Forbes, http://blogs.forbes.com/bruceupbin/2011/02/16/watson-wins-it-all-with-367-bet/, retrieved 2011-02-21 
  45. ^ Oldenburg, Ann (2011-02-17), Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!', USA Today, http://content.usatoday.com/communities/entertainment/post/2011/02/ken-jennings-my-puny-brain-did-just-fine-on-jeopardy-/1, retrieved 2011-02-21 
  46. ^ Moore, Frazier (2011-02-16), Spoiler Alert: 'Jeopardy!' Man vs. Machine Tourney Concludes, Yahoo! TV Blog, http://tv.yahoo.com/blog/spoiler-alert-jeopardy-man-vs-machine-tourney-concludes--2385, retrieved 2011-02-18 
  47. ^ World Community Grid to benefit from Jeopardy! competition, World Community Grid, 2011-02-04, http://www.worldcommunitygrid.org/about_us/viewNewsArticle.do?articleId=148, retrieved 2011-02-19 
  48. ^ Jeopardy! And IBM Announce Charities To Benefit From Watson Competition, IBM Corporation, 2011-01-13, http://www-03.ibm.com/press/us/en/pressrelease/33373.wss, retrieved 2011-02-19 
  49. ^ IBM's Watson supercomputer crowned Jeopardy king, BBC News, 2011-02-17, http://www.bbc.co.uk/news/technology-12491688, retrieved 2011-02-17 
  50. ^ Markoff, John (2011-02-16), Computer Wins on ‘Jeopardy!’: Trivial, It’s Not, Yorktown Heights, New York: The New York Times, http://www.nytimes.com/2011/02/17/science/17jeopardy-watson.html, retrieved 2011-02-17 
  51. ^ Wakeman, Nick (2011-02-17), IBM's Watson heads to medical school, http://washingtontechnology.com/articles/2011/02/17/ibm-watson-next-steps.aspx, retrieved 2011-02-19 
  52. ^ Weber, Robert C. (2011-02-14), Why 'Watson' matters to lawyers, The National Law Journal, http://www.law.com/jsp/nlj/PubArticleNLJ.jsp?id=1202481662966&slreturn=1&hbxlogin=1, retrieved 2011-02-18 
  53. ^ Merritt, Rick (2011-02-14), IBM playing Jeopardy with tax dollars, EE Times, http://www.eetimes.com/electronics-news/4213145/IBM-playing-Jeopardy-with-tax-dollars, retrieved 2011-02-19 


Go there...
http://en.wikipedia.org/wiki/Watson_(artificial_intelligence_software)#Videos

Don
