• 0 Posts
  • 16 Comments
Joined 4 months ago
Cake day: January 12th, 2026

  • Like the “hasn’t left the lab in 75 years” thorium reactors (which current designs still need enriched uranium)? And the recycling reactors that produce weapons-grade plutonium (of course, also via enriched uranium)? I’d love to see you

    No, I don’t mean those. I mean the CANDUs, a viable system that has been operating for around the same amount of time thorium has been in development hell (again, 75 years).

    Are you trying to say America has never had a nuclear disaster on record? Because it’s pretty easy to google that the US has had more nuclear accidents in the 2000s than Canada has in the past century. The Three Mile Island meltdown was probably the worst nuclear accident in North America; it’s hardly reasonable to ignore it. Unless you count uranium mining accidents, in which case the Church Rock uranium mill takes the crown.

    And which country has ~2000 nuclear reactors? I must have missed this in my research; with those numbers they’d account for approximately 4x the total number of reactors in the world, a surprising oversight. (Or are you doing some football math where 94/19 = 100x? Because even if 94/19 ≈ 5x, per capita America is still lacking.)
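    For what it’s worth, the per-capita arithmetic is easy to sanity-check. The reactor counts (94 vs 19) are from the comment above; the population figures are my own rough assumptions, not from the thread:

    ```python
    # Reactor counts from the thread: ~94 operating reactors in the US, ~19 in Canada.
    us_reactors, ca_reactors = 94, 19

    # Rough population figures (my assumption): ~335M US, ~40M Canada.
    us_pop, ca_pop = 335e6, 40e6

    ratio = us_reactors / ca_reactors               # roughly 5x, nowhere near 100x
    us_per_million = us_reactors / (us_pop / 1e6)   # reactors per million people, US
    ca_per_million = ca_reactors / (ca_pop / 1e6)   # reactors per million people, Canada

    print(round(ratio, 1), round(us_per_million, 2), round(ca_per_million, 2))
    ```

    Under those assumptions Canada comes out ahead per capita, which is the point being made.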




  • klankin@piefed.ca to Memes@lemmy.ml · systemd · 1 day ago
    Admittedly it is pretty confusing, but its spec describes it as “JSON with functions”, and once you get a handle on the recursive aspect of it (and that it kinda smushes multiple imported JSONs together), it’s not too bad.

    Stupid useful too
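    Assuming the language being described here is Jsonnet (whose documentation literally calls it “JSON with functions” — the comment doesn’t name it, so this is my guess), the recursive/merge behavior looks roughly like this sketch:

    ```jsonnet
    // Imported configs get "smushed together" via object composition:
    // right-hand fields override, and `self` resolves against the merged result.
    local base = { mem: 512, cpus: 1 };

    base {
      mem: 1024,                     // overrides the value from `base`
      total: self.mem * self.cpus,  // recursive: evaluates against the merged object
    }
    ```

    Evaluating this yields `{ cpus: 1, mem: 1024, total: 1024 }` — `self.mem` sees the override, not the imported value, which is the recursive aspect being described.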



  • I mean it’s more like self-driving cars than cars themselves; it can work, but steering wheels were created by the devs for a reason, even if most are too lazy to understand that reason.

    Like, I’d agree hand-coding in assembly is (mostly) useless these days, but honestly I feel like the efficiency problems AI is trying to solve were largely solved 50 years ago with compilers.

    (And isn’t digesting large outputs the entire point of being an engineering-level dev? Like, if you’re just there to pray to the software gods, you’d do much better as a CRUD script kiddie anyways.)






  • I largely agree, but I could see a prebuilt iso/img including u-boot for the most common boards being a lot more user friendly than doing it by hand.

    That, and a binary cache could make things take a couple of minutes for a download vs a couple of days to compile the kernel + all packages for any user on lower-end hardware.

    Kinda like what Armbian provides for the ARM space, though with a much harder initial curve since they hand-roll their own distro.




  • klankin@piefed.ca to Linux@lemmy.ml · *Permanently Deleted* · 2 months ago

    Mid-range networking equipment common in higher-end homelabs or small/medium enterprises.

    Doesn’t compete with fancier Cisco gear, but has an easy-to-use interface that can scale fairly well.

    Though like most networking equipment the hardware is dirt cheap, so Alpine’s lightweight base fits it well.



  • We don’t yet have proof AI can “imagine” new things; it just interpolates between existing ones. For complex relationships such as realistic fluid/particle dynamics it also requires billions of inputs before approximating reasonable outputs, so the cost against a potentially nonexistent ROI timeline just doesn’t add up. It’s made even worse if you’re already running billions of viable simulations just to generate thousands.

    This is why most modern techbro AI requires massive internet piracy; without already having the training data readily available (but not efficiently simulated), the algorithms aren’t worth much.

    Tangentially, this is why such algorithms have many applications in the medical field: they generally have access to a large dataset of human-annotated diagnoses that can’t readily be created by a computer.