the issue is that you have millions or billions of nodes and no human is capable of understanding what they all do and why
then why the fuck aren't you trying to understand them
if the intent was actual research and not the production of empty capitalistic hype, you'd be trying to demystify the damn processes
scale
scale
lots of people are trying to understand them, it's just that this is, like, hard
that's what i mean, orange; i'm aware of the data science, but it's still willing blindness to not even attempt to meaningfully analyse the data and instead to simply assume that if you throw enough of people's stolen data at it, it'll become good
at a very basic level neural networks aren't complicated
you basically just multiply and add lol
the issue is that you do those operations a metric fuck ton of times, and each individual operation has no discernible meaning
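a minimal sketch of the "just multiply and add" point above: a tiny two-layer forward pass where every weight, bias, and input is a made-up number, chosen only for illustration. a real network does exactly this, just billions of times.

```python
# A toy two-layer network forward pass: nothing but multiplies and adds
# (plus a ReLU nonlinearity). All numbers here are arbitrary.

def relu(x):
    return [max(0.0, v) for v in x]

def layer(inputs, weights, biases):
    # each output = sum of (input * weight) + bias  -- just multiply-add
    return [
        sum(i * w for i, w in zip(inputs, row)) + b
        for row, b in zip(weights, biases)
    ]

x = [1.0, 2.0]
w1 = [[0.5, -1.0], [0.25, 0.75]]   # layer 1: 2 inputs -> 2 hidden units
b1 = [0.1, -0.2]
w2 = [[1.0, 1.0]]                  # layer 2: 2 hidden -> 1 output
b2 = [0.0]

hidden = relu(layer(x, w1, b1))
output = layer(hidden, w2, b2)
print(output)
```

no single multiply or add here "means" anything on its own, which is the whole interpretability problem once you scale this up to billions of weights.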
arete i've been paying attention to the Top AI Researchers at Google and OpenAI and such
they are not actually trying to understand how the machines work
hear me out. use ai to understand ai
all they're trying to do is make them bigger, not to actually make any meaningful advancements in the field
unironically this is basically the only viable path
real
I have a feeling that's an idea already used
Yeah
I have a feeling you get no bitches
Who
yes that's already been tried, it doesn't work because the machine you're using to analyse the neural net is just as useless as the original neural net
Thank goodness
sigmund freud aka john projection
analysing our machine models with other machine models would be good if those models were actually good at what they do. but they are profoundly shit at what they do