• Kajika@lemmy.ml (OP)

    Took me 2 hours to find out why the final output of a neural network was a bunch of NaNs. This is always very annoying, but I can’t really complain, it makes sense. Just sucks.

    • flying_sheep@lemmy.ml

      I guess you can always just add an assert not data.isna().any() in strategic locations

      • Kajika@lemmy.ml (OP)

        That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (it would be slow too). I skill-issued myself by assuming a struct would be zero-initialized: MyStruct input; is not, while MyStruct input {}; is (that was the fix). Long story.
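
        Roughly what bit me, as a minimal sketch (MyStruct here is a hypothetical stand-in for the real input struct):

            #include <cstdio>

            // Hypothetical stand-in for the actual input struct.
            struct MyStruct {
                float bias;
                float weights[4];
            };

            int main() {
                MyStruct garbage;   // default-initialization: members hold indeterminate values
                MyStruct zeroed{};  // value-initialization: every member is zero-initialized

                // Reading garbage.bias here would be undefined behaviour;
                // zeroed.bias is guaranteed to be 0.0f.
                std::printf("%f\n", zeroed.bias);
            }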

        • fkn@lemmy.world

          I too have forgotten to memset my structs in C++ TensorFlow after prototyping in Python.
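
          Something like this, with a hypothetical plain-old-data struct (value-initializing with {} does the same thing more idiomatically):

              #include <cstring>

              // Hypothetical POD input struct.
              struct InputTensor {
                  float values[128];
                  int   length;
              };

              int main() {
                  InputTensor in;
                  std::memset(&in, 0, sizeof in);  // zero every byte; only valid for trivially copyable types
                  // ... fill in and feed it to the model ...
              }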

        • TheFadingOne@feddit.de

          If you use GNU libc, the feenableexcept function, which lets you enable trapping of certain floating-point exceptions, could be useful for catching unexpected/unwanted NaNs.
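
          A minimal sketch, assuming GNU libc (note that FE_INVALID only fires when an operation produces a NaN from non-NaN inputs, e.g. 0.0/0.0; it won’t catch a quiet NaN that merely propagates through later arithmetic):

              #include <fenv.h>   // feenableexcept is a GNU extension
              #include <cstdio>

              int main() {
                  // Turn the selected floating-point exceptions into SIGFPE traps.
                  feenableexcept(FE_INVALID | FE_DIVBYZERO | FE_OVERFLOW);

                  volatile double zero = 0.0;
                  double bad = zero / zero;  // raises FE_INVALID -> SIGFPE, aborting right here
                  std::printf("%f\n", bad);  // never reached
              }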

        • nickwitha_k (he/him)@lemmy.sdf.org

          Oof. This makes me appreciate the abstractions in Go. It’s a small thing, but initializing structs with zero values by default is nice.