The Dawn of the Age of Artificial Intelligence



The advances we’ve seen in the past few years—cars that drive themselves, useful humanoid robots, speech recognition and synthesis systems, 3D printers, Jeopardy!-champion computers—are not the crowning achievements of the computer era. They’re the warm-up acts. As we move deeper into the second machine age we’ll see more and more such wonders, and they’ll become more and more impressive.

How can we be so sure? Because the exponential, digital, and recombinant powers of the second machine age have made it possible for humanity to create two of the most important one-time events in our history: the emergence of real, useful artificial intelligence (AI) and the connection of most of the people on the planet via a common digital network.

Either of these advances alone would fundamentally change our growth prospects. When combined, they’re more important than anything since the Industrial Revolution, which forever transformed how physical work was done.

Thinking Machines, Available Now

Digital machines have escaped their narrow confines and started to demonstrate broad abilities in pattern recognition, complex communication, and other domains that used to be exclusively human. We’ve recently seen great progress in natural language processing, machine learning (the ability of a computer to automatically refine its methods and improve its results as it gets more data), computer vision, simultaneous localization and mapping, and many other areas.

We’re going to see artificial intelligence do more and more, and as this happens costs will go down, outcomes will improve, and our lives will get better. Soon countless pieces of AI will be working on our behalf, often in the background. They’ll help us in areas ranging from trivial to substantive to life changing. Trivial uses of AI include recognizing our friends’ faces in photos and recommending products. More substantive ones include automatically driving cars on the road, guiding robots in warehouses, and better matching jobs and job seekers. But these remarkable advances pale against the life-changing potential of artificial intelligence.

Artificial intelligence: CAPTCHA ‘Turing test’ is passed

Vicarious, a startup developing artificial intelligence software, has announced that its algorithms can now reliably solve modern CAPTCHAs, including Google’s reCAPTCHA, the world’s most widely used test of a machine’s ability to act human. A CAPTCHA (which stands for “Completely Automated Public Turing test to tell Computers and Humans Apart”) is considered broken if an algorithm is able to achieve a precision of at least 1%.

Leveraging core insights from machine learning and neuroscience, the Vicarious AI can achieve success rates of up to 90% on modern CAPTCHAs from Google, Yahoo, PayPal, and others. This advancement, the company says, renders text-based CAPTCHAs no longer effective as a Turing test. “Recent AI systems like IBM’s Watson and deep neural networks rely on brute force: connecting massive computing power to massive datasets.


This is the first time this distinctively human act of perception has been achieved, and it uses relatively minuscule amounts of data and computing power. The Vicarious algorithms achieve a level of effectiveness and efficiency much closer to actual human brains,” said Vicarious co-founder D. Scott Phoenix. “Understanding how the brain creates intelligence is the ultimate scientific challenge. Vicarious has a long-term strategy for developing human-level artificial intelligence, and it starts with building a brain-like vision system. Modern CAPTCHAs provide a snapshot of the challenges of visual perception, and solving those in a general way required us to understand how the brain does it,” said co-founder Dr. Dileep George. Solving CAPTCHA is the first public demonstration of Recursive Cortical Network (RCN) technology. Although commercial applications are still many years away, RCN will have broad implications for robotics, medical image analysis, image and video search, and many other fields. “We should not underestimate the significance of Vicarious crossing this milestone,” said Facebook co-founder and board member Dustin Moskovitz. “This is an exciting time for artificial intelligence research, and they are at the forefront of building the first truly intelligent machines.”
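The 1% threshold mentioned above is easy to express in code. Below is a hypothetical evaluation harness for judging whether a CAPTCHA scheme counts as “broken” under that definition; the solver and test set are illustrative stand-ins, not Vicarious’s actual system.

```python
# A CAPTCHA scheme is considered broken if an automated solver
# achieves a success rate of at least 1% on it.
BROKEN_THRESHOLD = 0.01

def is_broken(solver, captchas):
    """Return True if the solver's success rate meets the 1% bar."""
    solved = sum(1 for image, answer in captchas if solver(image) == answer)
    return solved / len(captchas) >= BROKEN_THRESHOLD

# Toy example: a mock solver that answers 9 of 10 samples correctly
# (a 90% success rate, comfortably past the 1% threshold).
samples = [(f"img{i}", f"txt{i}") for i in range(10)]
mock_solver = lambda image: "txt" + image[3:] if image != "img9" else "wrong"
print(is_broken(mock_solver, samples))
```

A real harness would of course run against live CAPTCHA images, but the pass/fail criterion is exactly this ratio test.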

China’s Tianhe-2 still number 1

The 42nd edition of the twice-yearly TOP500 list was announced yesterday at the SC13 conference in Denver, Colorado. While a typical desktop PC has four cores, Tianhe-2 (which means “Milky Way 2”) features 3,120,000, based on Intel’s “Ivy Bridge” 22-nanometre processors. It has 1,024,000 gigabytes of random-access memory (RAM) and 12.4 petabytes of storage space, and needs 17,800 kilowatts (kW) of electricity to run. Including external cooling, it requires 24,000 kW. The entire complex occupies 720 square metres of floor space and cost 2.4 billion yuan (US$390 million).


China’s National University of Defence Technology (NUDT) – which developed Tianhe-2 – says it will be offered as a “research and education” tool once tests are completed. Local reports suggest that the car industry is a “priority” client, so it may be useful in complex engine simulations, for example, or devising new materials and more efficient components.

Titan – installed at the U.S. Department of Energy’s (DOE) Oak Ridge National Laboratory – remains the no. 2 system, achieving 17.59 petaflop/s on the Linpack benchmark. Titan is among the most energy-efficient systems on the list, consuming a total of 8.21 MW of electrical power and delivering 2.14 gigaflops per watt, compared to 1.9 for Tianhe-2.
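The efficiency figures quoted here are just sustained Linpack performance divided by power draw, and they check out; the sketch below redoes the arithmetic, using Tianhe-2’s published Linpack score of 33.86 petaflop/s alongside the Titan numbers above.

```python
# Energy efficiency = sustained Linpack performance / electrical power.
PETA = 1e15
GIGA = 1e9

def gflops_per_watt(pflops, megawatts):
    """Sustained gigaflops delivered per watt of electrical power."""
    return (pflops * PETA / GIGA) / (megawatts * 1e6)

titan = gflops_per_watt(17.59, 8.21)    # Titan: 17.59 PF/s at 8.21 MW
tianhe2 = gflops_per_watt(33.86, 17.8)  # Tianhe-2: 33.86 PF/s at 17.8 MW
print(f"Titan:    {titan:.2f} GFLOPS/W")
print(f"Tianhe-2: {tianhe2:.2f} GFLOPS/W")
```

Both results match the article’s figures of 2.14 and 1.9 GFLOPS per watt.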

Sequoia, an IBM BlueGene/Q system installed at the DOE’s Lawrence Livermore National Laboratory, is the no. 3 system. First delivered in 2011, Sequoia reached 17.17 petaflop/s on the Linpack benchmark.

In all, there are 31 systems with performance greater than a petaflop/s on the list, an increase of five compared with the June 2013 list. Intel continues to provide the processors for the largest share (82.4 percent) of TOP500 systems.

Although China holds the no. 1 spot, the U.S. is clearly the leading consumer of supercomputers, with 265 of the top 500 systems (253 last time). The European share (102 systems compared to 112 last time) is still lower than the Asian share (115 systems, down from 118 last time).

Like many forms of information technology, the growth of supercomputing power has followed a remarkably smooth and consistent trend. If that trend continues, we can expect to see the first exaflop machine by 2019. An exaflop is 1,000,000,000,000,000,000 (a million trillion, or a quintillion) calculations per second. Such computing power will be invaluable to researchers – providing faster and more accurate simulations of climate, weather, astrophysics, genetics, molecular dynamics and many other processes. Zettaflop machines could emerge by 2030.
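The 2019 estimate can be reproduced with a back-of-the-envelope extrapolation. The sketch below assumes the historical TOP500 trend of roughly a tenfold increase in top Linpack performance every 3.6 years, starting from Tianhe-2’s 33.86 petaflop/s in 2013; the growth rate is an assumption based on the long-run trend, not a figure from this article.

```python
import math

# Extrapolate the top Linpack score forward, assuming it keeps growing
# ~10x every 3.6 years (roughly the long-run TOP500 trend).
baseline_pflops = 33.86          # Tianhe-2, November 2013 list
baseline_year = 2013
annual_factor = 10 ** (1 / 3.6)  # about 1.9x per year

target_pflops = 1000             # 1 exaflop/s = 1000 petaflop/s
years_needed = math.log(target_pflops / baseline_pflops, annual_factor)
eta = baseline_year + math.ceil(years_needed)
print(f"First exaflop machine plausible by ~{eta}")
```

Under these assumptions the crossover lands between five and six years after 2013 – consistent with the “first exaflop machine by 2019” projection.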

IBM achieves new speed record for big data

The device pictured above is a new, ultra-fast, energy-efficient analog-to-digital converter (ADC) presented this week at the International Solid-State Circuits Conference (ISSCC) in San Francisco. It can transfer Big Data between clouds and data centres four times faster than current technology. At this speed, 160 gigabytes – the equivalent of a two-hour, 4K ultra-high-definition movie – could be downloaded in only a few seconds.


While only a lab prototype, a previous version of the design has been licensed to Semtech Corp, a leading supplier of analog and mixed-signal semiconductors. The company is using this technology to develop advanced communications platforms expected to be announced later this year.

As Big Data and Internet traffic continue to grow exponentially, future networking standards will have to support higher data rates. In 1992, for example, 100 gigabytes of data was transferred per day, whereas today, that figure has grown to over two exabytes daily, a 20 million-fold increase.

To support the increase in traffic, scientists at IBM Research and Ecole Polytechnique Fédérale de Lausanne (EPFL) have been developing ADC technology to enable complex digital equalization across long-distance fiber channels. An ADC converts analog signals to digital, approximating the right combination of zeros and ones to digitally represent data so it can be stored on computers and analysed for patterns.
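Conceptually, what an ADC does is sample a continuous signal and map each sample to the nearest of a finite set of digital levels. The sketch below is purely illustrative – a software model of uniform quantization, nothing like the speeds or circuit techniques of the IBM/EPFL device – but it shows the “right combination of zeros and ones” idea concretely.

```python
import math

def quantize(sample, bits, vmin=-1.0, vmax=1.0):
    """Map an analog voltage to the nearest n-bit digital code."""
    levels = 2 ** bits
    step = (vmax - vmin) / (levels - 1)
    code = round((sample - vmin) / step)
    return max(0, min(levels - 1, code))  # clamp to the valid code range

# Sample a 1 kHz sine wave at 8 kHz with an 8-bit converter.
fs, f, bits = 8000, 1000, 8
codes = [quantize(math.sin(2 * math.pi * f * n / fs), bits)
         for n in range(8)]
print(codes)
```

Each 8-bit code is one of 256 levels; a real high-speed ADC does the same mapping billions of times per second, and the digital equalization mentioned above then corrects for distortion accumulated along the fiber channel.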

IBM forms Watson Group


I came across a very interesting article about a big investment in AI: the Watson Group.

IBM today announced it will establish the IBM Watson Group, a new business unit dedicated to the development and commercialisation of cloud-delivered cognitive innovations. The move signifies a strategic shift by IBM to accelerate into the marketplace a new class of software, services and apps that can “think”, improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data.

IBM will invest more than $1 billion into the Watson Group, focusing on research and development to bring cloud-delivered cognitive applications and services to market. This will include $100 million available for venture investments to support IBM’s recently launched ecosystem of start-ups and businesses, which are building a new class of cognitive apps powered by Watson, in the IBM Watson Developers Cloud.
