• nednobbins@lemm.ee · 1 year ago

    Every time I see posts like this I remember a frequent argument I had in the early 2000s.

    Every time I talked with photography students (I worked at an art school) or a general photography enthusiast, I got the same smug predictions about digital photography. The resolution sucked, the color sucked, the artist doesn’t have enough control, etc. They all assured me that digital photography might be nice for casual vacation photos and maybe a few specialty applications but no way, no how, not even when hell freezes over would any serious photographer ever consider digital.

    At the time I would think back to my annoying grade school discussions with teachers who assured me that (dot matrix) printers just sucked. Serious writing was done by hand and if you didn’t know cursive you might as well be illiterate.

    For some reason, people keep forgetting that technology marches on. The dumb glitches that are so easy to make fun of now will get addressed. There are billions of dollars pouring into AI development. Every major company and country is developing these systems, and the pay rates for AI developer jobs attract huge numbers of people to solve those problems.

      • Honytawk@lemmy.zip · 1 year ago

        We have plenty of indication, looking at past technologies that many thought had plateaued, that they keep being improved.

        Didn’t Bill Gates think spam would be a thing of the past … in 2006?

        My junk folder disagrees.

  • WEE_WOO@lemm.ee · 1 year ago

    It’s shocking to me how many people try to make cakes without fear. It just doesn’t taste the same without it.

  • Nacktmull@feddit.de · edited · 1 year ago

    Why is it even called artificial intelligence, when it’s obviously just mindless artificial pattern reproduction?

    • BluesF@feddit.uk · 1 year ago

      Machine Learning is a much better name. It describes what is happening: a machine is learning to do some specific thing. In this case, to take text and output pictures. It’s limited by what it learned from. It learned from arrays of numbers representing the colours of pixels, and from strings of text. It doesn’t know what that text means, it just knows how to translate it into arrays of numbers. There is no intelligence, only limited learning.

      • DroneRights [it/its]@lemm.ee · English · 1 year ago

        Machine Learning isn’t a good name for these services because they aren’t learning. You don’t teach them by interacting with them. The developers did the teaching and the machine did the learning before you ever opened the browser window. You’re interacting with the result of learning, not with the learning.

    • Honytawk@lemmy.zip · 1 year ago

      No, DALL-E 3, an AI known for creating brilliant pictures but trained on neither general intelligence nor cooking, is the one creating bad recipes.

  • Default_Defect@midwest.social · English · 1 year ago

    Some tech bro will attempt to make this cake and will tell someone it was better than anything some uppity WOKE human baker could have made, regardless of how badly it turned out.