Computer History Museum - A visit

As a child, I used to imagine that when I grew up I would work in a bookstore: I would sell books and, in my spare time, read them all. Whenever I get a chance to visit a big bookstore or library, I feel as if I have entered heaven, and a thrill runs down my spine.
This August, I got a chance to visit San Jose (I was there to attend the international conference ASONAM 2016 in San Francisco). Feeling a bit adventurous, I tried Airbnb for the first time and stayed with Venna, a very sweet lady. When I arrived, she suggested that since I am into computers I should visit the Computer History Museum. I took her suggestion and am very happy that I did. The museum kindled the child in me, and once again I was back in childhood (not that I have really grown old): seeing the Babbage Difference Engine, the PDP-1, and the Jacquard loom, I was transported into a new world. The museum brings the history of computers back to life through its gigantic exhibits. The exhibition Rev…

Deep Blue: A review

Paper: Deep Blue
Authors: Murray Campbell, A. Joseph Hoane Jr., Feng-hsiung Hsu

Deep Blue was the chess machine (a powerful combination of hardware and software) developed by IBM. The machine became important in the field of AI because it defeated the then reigning world chess champion, Garry Kasparov, in 1997. This paper gives insight into the impressive hardware and software behind Deep Blue.

Key Points:
Deep Blue learned from the experience of its predecessors ChipTest, Deep Thought, and Deep Thought II, and from Deep Blue I's matches with human players. The major changes in the Deep Blue architecture were:
  • Hardware-implemented evaluation function: Its chess chip had a redesigned evaluation function based on over 8,000 features. Since the evaluation function was implemented in hardware, it offered constant execution time, but this also limited future feature additions. A majority of the features and weights of the evaluation function were created and tuned by hand.
  • Processor: The whole system was a 30-node IBM RS/6000 SP computer with 480 single-chip search engines. It had 28 nodes with 120 MHz P2SC processors and 2 nodes with 135 MHz P2SC processors, all communicating via a high-speed switch. Each node had 16 chess chips, 1 GB of RAM, and 4 GB of disk, and the system ran the AIX 4.2 operating system. The chips also supported the use of an external FPGA to allow more complicated search control, additional terms for the evaluation function, and an external transposition table. Each chess chip in Deep Blue could search 2 to 2.5 million chess positions per second.
  • State-space search: Deep Blue's search combined hardware and software (a hybrid search). The software search was implemented in C, and the hardware search was encoded on the chess chips, with roughly 500 parallel processors available to participate in the tree search. Deep Blue worked in three layers: the master, then the workers, followed by the chess chips. One SP processor, designated the master, searched the top levels of the chess game tree; the remaining processors, designated workers, explored the ‘leaf’ positions; and at the third level, the chess chips searched the last few levels of the tree. Deep Blue used quiescence search, iterative deepening, transposition tables, and NegaScout. The software search along the state space was non-uniform, based on the principles of extending forcing/forced pairs of moves, fractional extensions, delayed extensions, dual credit, and preserving the search envelope.
  • Books and databases: Deep Blue kept an opening book, an extended book, and endgame databases to reduce search complexity where possible. Further, it implemented ‘time control’ with the help of two time targets to ensure a feasible response time.
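Of the software-search techniques listed above, NegaScout is the easiest to illustrate in isolation. Below is a minimal sketch of NegaScout on a toy game tree; the tree representation and the `evaluate`/`children` helpers are my own illustrative assumptions, and Deep Blue's real search was vastly more elaborate (quiescence, extensions, and the hardware search on the chess chips).

```python
def negascout(node, depth, alpha, beta, evaluate, children):
    """Return the negamax value of `node` using NegaScout.

    `evaluate(node)` scores a position from the side to move;
    `children(node)` yields successor positions, best guesses first.
    """
    moves = children(node)
    if depth == 0 or not moves:
        return evaluate(node)
    for i, child in enumerate(moves):
        if i == 0:
            # First child: search with the full (alpha, beta) window.
            score = -negascout(child, depth - 1, -beta, -alpha, evaluate, children)
        else:
            # Later children: cheap null-window ("scout") search first.
            score = -negascout(child, depth - 1, -alpha - 1, -alpha, evaluate, children)
            if alpha < score < beta:
                # The scout search failed high: re-search with a full window.
                score = -negascout(child, depth - 1, -beta, -score, evaluate, children)
        alpha = max(alpha, score)
        if alpha >= beta:
            break  # beta cut-off: opponent will avoid this line
    return alpha


# Toy tree: internal nodes are lists, leaves are integers
# (each leaf is a score from the side to move at that leaf).
tree = [[3, 17], [2, 12], [15, 25], [0, 2, 5], [3]]

def children(node):
    return node if isinstance(node, list) else []

def evaluate(node):
    return node if isinstance(node, int) else 0

print(negascout(tree, 3, float("-inf"), float("inf"), evaluate, children))  # prints 15
```

The null-window trick is what distinguishes NegaScout from plain alpha-beta: when move ordering is good (as Deep Blue's iterative deepening and transposition tables encouraged), most scout searches fail low cheaply and few positions need the full-window re-search.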

To quote the authors, “The success of Deep Blue in 1997 was not the result of any one factor”. All of the above contributed. Still, there were areas that could have been improved, for instance parallel search efficiency and better pruning mechanisms. The paper explains the rationale behind the design choices that were made and, at the same time, points at the alternatives future explorers can take.
I will end my review by adapting Neil Armstrong's famous words: "That's one small step for the machine, one giant leap for mankind/cyborgs."