Category: Future Forecasts

Third Stage Boost – Part 2: Implications of Neuromorphic Computing

Neuromorphic Computing: Statistical Mechanics & Criticality   Last week, I suggested that we were on the verge of something new, and referenced an article by von Bubnoff: A brain built from atomic switches [that] can learn, together with the follow-on article Brain Built on Switches. The key innovation described in this article was a silver mesh, as shown in the following figure. This mesh is a “network of microscopically thin intersecting silver wires,” grown via a combination of electrochemical and…

Read More

Third Stage Boost: Statistical Mechanics and Neuromorphic Computing – Part 1

Next-Generation Neural Network Architectures: More Brain-Like   Three generations of artificial intelligence. The third generation is emerging … right about … now. That’s what is shown in this figure, presented in log-time scale. Brief history of AI in log-time scale The first generation of AI, symbolic AI, began conceptually around 1954 and lasted until 1986: 32 years. On the log-time scale shown in the figure above, this entire era takes place under the first curve, the black bell-shaped curve on…

Read More

Future Forecasts: How We’ll Mind-Control Ourselves

Tweaking Our Own Mental State: Getting Easier All the Time   This last spring, twelve minutes changed my life forever. I got into a heck of a fistfight. I went into a dark cave, and put on an alternate identity and transformed into the baddest-a** thing around. I had one of the most spiritual, exalted, uplifting experiences that I’ve ever had. And I fell in love. So here’s the story. I was at the NVIDIA GTC (GPU Technology Conference) this…

Read More

2025 and Beyond

Artificial Intelligence and Jobs by the Year 2025: One of my biggest takeaways from the recent (May 2017) NVIDIA GTC (GPU Technology Conference) was less about the technology, and more about the near-term jobs impact of artificial intelligence (AI) and robotics. Making smart education and career decisions is crucial, as the emerging combination of AI and robotics will have a huge impact on jobs. Those of you studying artificial intelligence, deep learning, and neural networks will have a stronger career…

Read More

Deep Learning: The Fast Evolution of Artificial Intelligence

Just one slide from a deck that I’m working up for an upcoming online presentation at Northwestern University, but it tells the story. Just one more thought: here’s the rapid pace of evolution within just the image analysis realm of AI, largely due to multiple layers (sometimes many, many, MANY layers) of networks, a good fraction of which are Convolutional Neural Networks, or CNNs. Error rates have dropped from over 15% to about 3% within just four…

Read More

GPUs, CPUs, MIPS, and Brain-Based Computation

Quick links to useful diagrams:

- Michael Galloy has produced a good chart showing the increase in GPU vs. CPU processing over the past decade; it nicely continues the line of thought about nonlinear increases in processing power: http://michaelgalloy.com/2013/06/11/cpu-vs-gpu-performance.html
- See also this post by Karl Rupp: http://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/
- Also, this post by NVIDIA: http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter29.html
- For detailed discussion (including appropriate algorithms/methods), but NOT figures, see: http://pcl.intel-research.net/publications/isca319-lee.pdf Debunking the 100X GPU vs. CPU Myth: An Evaluation of Throughput Computing…

Read More

Modeling the Future: Tools from Complex Systems

2012 and Beyond: Tools for Predicting the Next Sixty Years   Is the world coming to an end on Dec. 21st, 2012, or not? Very likely, not. We’ll still wake up in the morning, in the same beds in which we went to sleep the night before. We’ll still walk out to our cars, or get to our Metro stations, on time. And we’ll likely stop for the same “cup of joe” on the way to work. But will our…

Read More

Good Read on Modeling Social Emergent Phenomena – But Still Not There Yet!

Philip Ball – Critical Mass   The most important thing we can do right now – given the huge changes ahead of us, in society, the world, and technology – is to get some sort of “handle” on what’s coming up. By that, I mean a good set of models. And as a result, I’m on a search for good models. Those that I know, those that are new. Those that make sense, and those that don’t. (We need…

Read More

Modeling Trends in Long-Term IT as a Phase Transition

The most reasonable model for our faster-than-exponential growth in long-term IT trends is that of a phase transition. At a second-order phase transition, the heat capacity becomes discontinuous. The heat capacity image is provided courtesy of the Wikipedia article on heat capacity. L. Witthauer and M. Diertele present a number of excellent computations in graphical form in their paper The Phase Transition of the 2D-Ising Model. There is another interesting article by B. Derrida & D. Stauffer in Europhysics…
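The heat-capacity behavior mentioned above can be explored directly. The following is a minimal sketch (not from the Witthauer & Diertele paper) of a standard Metropolis Monte Carlo simulation of the 2D Ising model; it estimates heat capacity per spin from energy fluctuations via the relation C = (⟨E²⟩ − ⟨E⟩²)/(k_B T² N), which peaks sharply near the critical temperature T_c ≈ 2.269 (in units where J = k_B = 1). Lattice size, sweep counts, and the seed are illustrative choices.

```python
import math
import random

def ising_heat_capacity(L=8, T=2.27, sweeps=2000, burn_in=500, seed=0):
    """Estimate heat capacity per spin of the 2D Ising model
    (periodic boundaries, J = k_B = 1) via Metropolis sampling."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

    def total_energy():
        # Sum each bond once, using right and down neighbors.
        e = 0
        for i in range(L):
            for j in range(L):
                e -= spins[i][j] * (spins[(i + 1) % L][j]
                                    + spins[i][(j + 1) % L])
        return e

    e = total_energy()
    samples = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1
                e += dE
        if sweep >= burn_in:
            samples.append(e)

    n = len(samples)
    mean_e = sum(samples) / n
    mean_e2 = sum(x * x for x in samples) / n
    # Fluctuation-dissipation: C per spin = var(E) / (T^2 * N)
    return (mean_e2 - mean_e ** 2) / (T * T * L * L)
```

Running this at a temperature near T_c and at a much higher temperature shows the peak that, in the infinite-volume limit, becomes the discontinuity (divergence, for the 2D Ising model) discussed above.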

Read More

Going Beyond Moore’s Law

Super-Exponential Long-Term Trends in Information Technology   Interesting read for the day: Super-exponential long-term trends in Information Technology by B. Nagy, J.D. Farmer, J.E. Trancik, & J.P. Gonzales shows that what Kurzweil suggested in his earlier work on “technology singularities” is true: we are experiencing faster-than-exponential growth within the information technology area. Nagy et al. are careful to point out that their work indicates a “mathematical singularity,” not to be confused with the more broadly-sweeping notion of a “technological singularity” discussed…
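The distinction between plain exponential growth and the kind of super-exponential growth that produces a “mathematical singularity” is easy to illustrate numerically. The sketch below (a toy illustration, not taken from the Nagy et al. paper; functions and parameters are hypothetical) compares a constant-doubling-time exponential against a hyperbolic law y = a/(t_c − t), which diverges at the finite time t_c and whose doubling time shrinks toward zero as t approaches t_c.

```python
import math

def exponential(t, a=1.0, r=0.5):
    """Ordinary exponential growth: constant doubling time ln(2)/r."""
    return a * math.exp(r * t)

def hyperbolic(t, a=1.0, t_c=10.0):
    """Super-exponential (hyperbolic) growth: y = a / (t_c - t).
    Diverges at the finite time t_c -- a 'mathematical singularity'."""
    return a / (t_c - t)

def doubling_time(f, t, dt=1e-4):
    """Instantaneous doubling time: ln(2) / (d ln f / dt),
    with the log-derivative estimated by a finite difference."""
    growth_rate = (math.log(f(t + dt)) - math.log(f(t))) / dt
    return math.log(2) / growth_rate
```

For the exponential, `doubling_time` returns roughly the same value at every t; for the hyperbolic curve it keeps shrinking, which is the signature of faster-than-exponential growth that the paper tests for in IT performance data.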

Read More