
Major Breakthrough - Researchers Enable a Desktop GPU to Run Neural Simulations of Brain Models of Almost Unlimited Size.

The desktop PC essentially becomes as useful as a very powerful compute cluster in terms of running brain simulations.

"Allows people outside academia to turn their gaming PC into a Supercomputer and run large neural networks.”"


At the time, computers were too slow for the method to be widely applicable, which meant that simulating large-scale
brain models has until now only been possible for the minority of researchers privileged to have access to supercomputer systems.

The researchers applied Izhikevich's technique to a modern GPU, with approximately 2,000 times the computing power available
15 years ago, to create a cutting-edge model of a macaque's visual cortex (with 4.13 × 10⁶ neurons and 24.2 × 10⁹ synapses)
which previously could only be simulated on a supercomputer.

The researchers’ GPU accelerated spiking neural network simulator uses the large amount of computational power available
on a GPU to ‘procedurally’ generate connectivity and synaptic weights ‘on the go’ as spikes are triggered – removing the need
to store connectivity data in memory.
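
The article doesn't include code, but the idea can be sketched roughly like this: instead of keeping a synapse table in memory, each neuron's outgoing connections are re-derived from a fixed seed every time that neuron spikes. A minimal CPU-side sketch in Python/NumPy (all names and parameters here are illustrative, not taken from the paper):

```python
import numpy as np

N_POST = 1000       # number of postsynaptic neurons (illustrative size)
P_CONN = 0.1        # connection probability
WEIGHT_SD = 0.01    # spread of synaptic weights

def outgoing_synapses(pre_id, base_seed=1234):
    """Re-derive neuron pre_id's outgoing connectivity on demand.

    Seeding a per-neuron RNG from (base_seed, pre_id) always reproduces
    the same targets and weights, so nothing has to be stored between
    spikes -- the connectivity is recomputed each time the neuron fires.
    """
    rng = np.random.default_rng([base_seed, pre_id])
    targets = np.flatnonzero(rng.random(N_POST) < P_CONN)
    weights = rng.normal(0.0, WEIGHT_SD, size=targets.size)
    return targets, weights

# When neuron 42 spikes, its synapses are regenerated, applied, then discarded.
targets, weights = outgoing_synapses(42)
post_input = np.zeros(N_POST)
post_input[targets] += weights    # deliver the spike to its targets
```

On a GPU the same regeneration happens in parallel across thousands of threads, which is why trading stored memory for extra computation pays off there.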

Initialization of the researchers' model took six minutes and simulation of each biological second took 7.7 min in the ground
state and 8.4 min in the resting state – up to 35% less time than a previous supercomputer simulation. In 2018, on one rack of
an IBM Blue Gene/Q supercomputer, initialization of the model took around five minutes and simulating one second of
biological time took approximately 12 minutes.
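
As a quick back-of-envelope check of that "up to 35%" claim, using only the per-biological-second times quoted above (rounding in the quoted figures accounts for the small difference):

```python
# Times per simulated biological second, in minutes (from the figures above).
gpu_ground, gpu_resting, blue_gene = 7.7, 8.4, 12.0

print(f"ground state:  {100 * (1 - gpu_ground / blue_gene):.0f}% less time")   # ~36%
print(f"resting state: {100 * (1 - gpu_resting / blue_gene):.0f}% less time")  # ~30%
```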

Prof Nowotny, Professor of Informatics at the University of Sussex, said: “Large-scale simulations of spiking neural network models
are an important tool for improving our understanding of the dynamics and ultimately the function of brains. However, even small
mammals such as mice have on the order of 1 × 10¹² synaptic connections meaning that simulations require several terabytes of
data – an unrealistic memory requirement for a single desktop machine.

“This research is a game-changer for computational neuroscience and AI researchers who can now simulate brain circuits on their
local workstations, but it also allows people outside academia to turn their gaming PC into a supercomputer and run large
neural networks.”
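
The terabyte figure in the quote above is easy to sanity-check. Assuming a handful of bytes per stored synapse (the 8 bytes below is an illustrative assumption, not a number from the article):

```python
# Rough memory estimate for explicitly storing mouse-scale connectivity.
n_synapses = 1e12         # ~10^12 synaptic connections (from the quote)
bytes_per_synapse = 8     # assumed: e.g. 4-byte target index + 4-byte weight
print(f"~{n_synapses * bytes_per_synapse / 1e12:.0f} TB for the synapse table alone")  # ~8 TB
```

That storage is exactly what the procedural-connectivity approach avoids having to keep in memory.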
 


I'm never gonna be able to upgrade or build a new PC.
 
If this turns your computer into a legitimate supercomputer for neural rendering, then it should bolster some facet of gaming
once the technique gets applied there. Certainly, ML will make sure it does some day.
 