- Aug 14, 2020
And again, what the hell does it have to do with how it will work on Series X? There is no dedicated hardware for machine learning. If it uses the CUs too heavily for machine learning, those CUs can't be used for other graphical tasks. Now, I'm not saying we won't see any benefit, but let's not pretend we'll see a whole new world of performance just because of it.

Also... you do realize Computer Scientists invented the very term Machine Learning, don't you? Yet you simply scoff at the idea of Machine Learning being a prime subject in Computer Science?
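The zero-sum point about compute units can be sketched in a few lines. This is a toy illustration only; the CU count and the ML share are hypothetical numbers, and real GPU scheduling is far more complicated than a simple subtraction:

```python
# Toy sketch of the zero-sum compute-unit budget described above.
# All numbers are hypothetical; real GPU work scheduling is far more complex.

TOTAL_CUS = 52  # hypothetical total compute-unit count

def split_budget(ml_cus: int, total: int = TOTAL_CUS) -> int:
    """Return the CUs left for rendering after ML work claims its share."""
    if not 0 <= ml_cus <= total:
        raise ValueError("ML share must fit within the total CU budget")
    return total - ml_cus

# Every CU spent on machine learning is one fewer CU for graphics.
print(split_budget(0))   # all CUs available for rendering
print(split_budget(12))  # ML inference leaves fewer CUs for shading
```

Without dedicated ML silicon, any inference workload and the renderer draw from the same pool, which is the trade-off the post is describing.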
Yes, it is in fact the main subject of nearly five books out of a total of seven.
It's important people realize Machine Learning is not just about propagating data with more statistical efficiency; it is about programming software that is actually able to code and create software far superior to what humans are capable of. Furthermore, applying Machine Learning to words, architectures, material science - quite literally every single physical entity will eventually be bolstered by Machine Learning and improved upon. Software optimization due to Machine Learning is merely topical, the tip of the iceberg here. According to Computer Science, the era before Machine Learning is implemented in full (roughly January 3rd, 2021) should be considered, quite literally, the Dark Ages before Machine Learning became common.
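Stripped of the grand claims, "learning" in the statistical sense just means fitting parameters to data. A minimal, self-contained sketch in plain Python (the toy data, learning rate, and iteration count are all arbitrary illustrative choices, not anyone's actual method):

```python
# Minimal 'machine learning': fit the model y = w * x by gradient descent.
# Toy data sampled from y = 2x; hyperparameters are arbitrary choices.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # model parameter, starts out wrong
lr = 0.05  # learning rate

for _ in range(200):
    # Gradient of the mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step downhill along the gradient

print(round(w, 3))  # converges toward the true slope, 2.0
```

The same loop, scaled up to millions of parameters and run on GPU compute units, is what "ML on the CUs" amounts to in practice.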