• 0 Posts
  • 48 Comments
Joined 1 year ago
Cake day: June 7th, 2023


  • v_krishna@lemmy.ml to Memes@lemmy.ml · "Meema said F everybody else!"
    13 points · 2 months ago

    99°F in Berkeley right now. My house was built in like 1928, and since we rent, obviously there’s been no energy-efficient updating of the insulation or anything like that since maybe the 60s. It’s like 94 inside right now. Sitting in front of multiple fans just blowing hot air at me, this is the life, y’all.


  • In the early 2000s, plucking and waxing your brows really, really thin was fashionable, especially in certain places. Then people realized it was dumb, but if you’re around 40 now it’s too late, since over-plucked brows often don’t fully grow back. So microblading helps you not look like a MADtv sketch. Source: my wife turns 40 this year, grew up in a particularly hood area of the SF Bay Area, and from 30 onwards really regretted plucking her eyebrows down to almost nothing.


  • Dr. Bronner’s for skin and hair (I have very thick Indian hair in jata-style dreadlocks down to my knees). For a long time I used a charcoal-based face wash (Lush’s Coalface until they changed the formula, then a similar brand I found on Amazon), but for whatever odd reason, after a few years my skin stopped tolerating it and kept breaking out, so I switched to Kate Somerville’s sulfur face wash, which works wonders (though unfortunately it does have a bit of a smell to it).

  • I think that is overly simplistic. The embeddings used by LLMs definitely do encode a notion of what things mean and how they relate to other things.

    E.g., compare the embeddings of Paris, Athens, and London with those of other cities and they will have a small cosine distance between them; compare France, Greece, and England and you get the same. Then, very interestingly, look at the difference vectors Paris - France, Athens - Greece, and London - England: you’ll find they all align (the vector arithmetic effectively captures the relationship "is the capital of"). Go a step further and compare those vectors to Paris - US, Athens - US, and London - Canada. You’ll see the previous set doesn’t align with these nearly as much, but these do align with each other (the relationship being something like "is a smaller city in this country, named after a famous city in some other country"). There’s a rough sketch of this in code below.

    The way attention works, there is a whole bunch of semantic meaning baked into the embeddings, and by comparing embeddings you can get at pragmatic meaning as well.
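
    A minimal sketch of that comparison, assuming the pretrained GloVe word vectors available through gensim’s downloader (the model name, the lowercased tokens, and the specific word pairs are illustrative choices, not something specified in the comment above):

    ```python
    import numpy as np
    import gensim.downloader as api

    # Pretrained GloVe vectors with a lowercase vocabulary (assumed model choice).
    vectors = api.load("glove-wiki-gigaword-100")

    def cosine(a, b):
        """Cosine similarity between two vectors (1.0 = pointing the same way)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Capital cities (and countries) sit close to one another in embedding space.
    print(cosine(vectors["paris"], vectors["athens"]))
    print(cosine(vectors["france"], vectors["greece"]))

    # The "capital - country" difference vectors roughly align with each other,
    # i.e. the arithmetic picks out an "is the capital of" direction.
    paris_france = vectors["paris"] - vectors["france"]
    athens_greece = vectors["athens"] - vectors["greece"]
    london_england = vectors["london"] - vectors["england"]
    print(cosine(paris_france, athens_greece))
    print(cosine(paris_france, london_england))

    # A different relationship (namesake town in another country) should give
    # difference vectors that align with each other more than with the
    # "capital of" set. Note the lowercase token "us" is ambiguous with the
    # pronoun, so this part is only suggestive.
    paris_us = vectors["paris"] - vectors["us"]
    athens_us = vectors["athens"] - vectors["us"]
    print(cosine(paris_us, athens_us))
    print(cosine(paris_france, paris_us))
    ```

    If the claim above holds, the capital-of difference pairs should score noticeably higher cosine similarity with each other than with the namesake-city pairs.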