I assume there must be a reason why sign language is superior but I genuinely don’t know why.
American Sign Language is not a gesture-based form of English. It is an entire language in its own right, with its own distinct grammar and vocabulary.
To someone deaf from birth, sign language is their native language. And it is much more comfortable to quickly read your native language than a second language.
This raises more questions than it answers, like how do people deaf from birth function in society at all if they struggle with languages other than sign language? How do they get a job, go to school, learn new skills, read the news, text people? What do they do in their leisure time if not watching subtitled movies or reading books? Many non-English speakers end up learning English anyway because of just how pervasive it is.
The same way anyone else for whom English is a second or third language functions in society.
I’m ESL and use English subtitles when watching a programme in a language I can’t speak…
You're on the internet. Most Deaf people these days read English fluently. It's just that Deaf 70 year olds were often able to get away without becoming actively fluent in English and may not have felt the need to. Closed captions are younger than most people think, and they fucking sucked fairly recently. I grew up watching the news with captions and it was distracting if you didn't need it. Big black boxes with the words said a few seconds ago rapidly appearing on them as they covered stuff. And often captions on prerecorded content weren't much better. It was an accessibility feature and treated as such. Technology Connections has a great video on closed captions that was almost nostalgic lol.
Then there's also habit. If you grew up with TV that had captions, you're used to it. But before captions we had terps (interpreters). At live events we have them. At a government press release they already needed one, because they can't just show the teleprompter to the Deaf people in the audience. So they just show the terp where we expect to see them on the screen. Like, I can't think of an event on TV that has interpreters that doesn't need one in person.
Think about written English: it’s phonetic.
How do you learn to RECOGNIZE A WRITTEN WORD when you don't know what it sounds like, let alone what the letters mean? It becomes a matter of a hundred thousand different symbols, recognized as a unit, removed from the auditory context.
I can't imagine how any deaf person learns to read, to be honest. It's an astounding feat.
Don't you just recognize the sequences? There are plenty of non-phonetic languages; you just recognize patterns instead of sounds.
Which is why we give deaf students extra attention in schools now…
The issue is the deaf community was forced to be insular for most of American history. And part of that included the stereotype of “deaf and dumb” where if a person was deaf, they were assumed to be stupid.
And some older members of that community see the next generation being treated more inclusively as a negative, because that means their community will shrink if people aren't forced to only interact with other deaf people. They don't want integration into the larger community, and they want to force future generations to be segregated as well.
And they're kind of right. Most of the people with that line of thought aren't people you'd want to voluntarily associate with. Wanting to hobble the next generation so you don't feel lonely is pretty low.
Dumb used to mean mute. The phrase meant deaf and unable to speak.
Of course, not being able to communicate leads a lot of people to think someone is stupid, and I imagine that’s why dumb is now synonymous with it.
I once met a lady with some severe disabilities, no idea what, in a powered wheelchair at a bar. She couldn’t talk, and had a massive keyboard she would sort of flail at until she spelled out the words she was trying to say. It audibly spoke for her.
This lady has two college degrees, writes books, and does art to help promote the concept that disabled people are people too.
Pretty damn impressive. Her and her husband’s main gripes were how infantilizing most people are to them. And how expensive good wheelchairs are, lol
Yeah. When people think of Helen Keller they rarely think socialist scholar and cofounder of the ACLU. They think of a little girl being taught to communicate or a manifestation of disability.
I want to give the flip side here. They used to separate us. There was active division based on how bad your hearing was. If you couldn’t hear with effort and hearing aids you were shunted away as a lost cause. But if you could they would tell your parents not to teach you sign language because you’d prefer it. That’s how me, my sister, my mom, and my grandma were all denied our right to a native language that would’ve been easier for us. They didn’t care that by 60 we’d be deaf as a post because our hearing loss was genetic and degenerative. All English all the time, and no acknowledgment that it took effort to hear.
The Deaf community can be insular assholes, but I understand it. Our culture is denied to us. Our language is denied to us. And maybe, just maybe, part of why we're pissed off is because we have some points that nobody wants to acknowledge. Like the fact that cochlear implants aren't some miracle. They're great, and my grandparents love theirs, but they're fucking exhausting to use. Hell, I'm a healthy 28 year old and I have to take my hearing aids out after work because they tire me. And for children born too deaf to use auditory communication with hearing parents, it's disturbing how few of those parents learn sign language. But every CODA (child of deaf adults) is taught spoken language (and they tend to maintain lifelong ties to Deaf culture).
I’m still mad I wasn’t taught sign as a kid. I’m glad I was given hearing aids but I deserved access to community like me. And if I’d reproduced I would’ve made damn sure my kid was a native signer so that way they’d never grow up in fear of inevitable silence or awkwardly fail to communicate with people who share their disability.
Well, subtitles are usually really fast. For most other things that you don't have to read live, reading a bit slower is not really an issue.
They don’t. Having your native language be easier than another doesn’t mean you’re struggling significantly.
You’ve hit on a problem that the Deaf community faces. There’s often an entire Deaf society in places. Deaf jobs, Deaf schools (including universities), Deaf media… They do read English but it’s harder and it’s not their primary language (though I’ve heard the internet is helping a lot there).
But yeah, there are Deaf universities, including prestigious ones like Gallaudet. Nobody teaches medicine or engineering in sign language from what I can see. I did check and I was pleasantly surprised that Gallaudet offered shit like math, biology, and IT with even grad programs in stuff that isn’t explicitly about deafness.
Would you rather watch content in your native language, or subtitled? If you read translated content, it’s fine. But it’s not the same as hearing something performed for you. Might be hard to grasp if your language is largely auditory and written, rather than visual and emotive.
Just because sign language is a visual language, does not mean reading is an equivalent. There is a ton of nuance and feeling that goes into communicating through sign language that is not possible through text alone.
Beyond the communication piece, there is the respect owed to an individual who natively speaks a language, and the importance of keeping the language alive.
feelitghst
There’s that nuance.
I assumed it was a German word.
… and looked for the 3 words that went into that word.
I mean, there wasn’t enough information to be certain… but live broadcasts of things would have a signer because the live audience would have to bring in screens to add subtitles to the event…
Minored in ASL, this is spot on 👍
Would you rather watch content in your native language, or subtitled?
Subtitled, 100 times out of 10. In fact, that’s what I already do, alongside a significant portion of the non-anglophone world.
But it’s not the same as hearing something performed for you.
Considering the fact that nearly all TV media is made to only be fully enjoyed if you can hear it, that’s a given. Deaf people are missing out either way, though.
There is a ton of nuance and feeling that goes into communicating through sign language that is not possible through text alone.
Just like there’s a ton of nuance that can’t be communicated by text alone when compared to spoken words, you mean?
the importance of keeping the language alive.
This is the only factor you’ve presented I can agree with. Programmes are presented with sign language because it’s important to maintain awareness that it exists. Deaf people are a very small minority, so keeping their languages alive is essential.
Not deaf/HOH, but I've watched some signed translations out of curiosity, and even to me it seems different. They do things like indicating the feeling of music, matching their facial expressions to the characters', and sometimes forgoing a direct translation to convey the mood of a phrase.
Even when you’re watching a subbed movie/show, you have the emotion of the voice performance to influence how you read the words. I imagine it’s the same for signed VS subbed translations (to anyone who signs, please correct me if I’m wrong).
That is super interesting, thanks a lot for the detailed comment! I wasn't aware that sign language is not directly translatable to text the way other languages are.
Yeah, but it’s not the translator speaking…
They’re translating spoken words.
They wouldn’t have someone watch the sign language and then translate that into the subtitles, that wouldn’t make any sense logically.
They’d make them off the original spoken words.
So while you're right that there'd be slight differences, those are already being introduced with the sign language, and subtitles maintain the original phrasing and tone.
But can subtitles accentuate the way sign language can?
Spoken word is to text as sign language is to text is my understanding.
I can emphasize a word in sign language in a way that I otherwise can't when it's just put to text.
Yes, exclamation points have existed almost as long as written language has…
You also couldn’t do a literal translation of spoken word in sign language. And different people can interpret it differently because of that.
So even if exclamation points didn’t exist, it would still be worth it to keep the wording as accurate as possible.
I'm deaf and it's real weird seeing you get downvoted for saying you can't translate English to sign language verbatim, 'cause it's true. Sign language is a lot more like broken English combined with body language; you don't word-for-word translate English to sign, there are too many words for starters, and lost in translation is a thing that exists.
Part of it is well-intentioned people that don't know any better, and I'm sure you're aware there are significant parts of the deaf community who are isolationists and view it as their whole identity rather than a small piece of what makes a person who they are. And back 50-60 years ago, that was kind of true.
Even back on Reddit, common sense is rarely common when talking about ASL and especially cochlear implants.
So my take on this comes from having a hearing friend who grew up in an anti-cochlear-implant / hearing-restoration family (deaf parents, grandparents; her and her sister are the only immediate family with "normal" hearing), and their preference was definitely to push ASL, which probably informs my outsider take on the matter.
Now, I'm not part of the deaf community in any real way (don't know any deaf people, and only a few who know sign, but I used to know the alphabet), but I'm badly near-sighted. Like, I didn't realize that everyone else could see individual leaves on trees instead of vague green blobs where the canopy was. And birds just disappeared into them. The first time I saw a Monet painting, and impressionist art in general, while I still appreciated the beauty, my first thought was, "Ah, a painting of the world as seen by a near-sighted person." That said, I'm very happy to wear glasses and see a truer representation of what the rest of the world sees rather than walking through life in an impressionistic world.
So, for me, I can’t see why anyone would choose to perceive less of the world than they could. If I could further augment my senses in a convenient manner, I would. If my parents had had to choose between some surgery and me being isolated from so much of the world, I’d ask for the surgery if I could.
Translation isn't a 1 to 1 process. Every language has differences, idioms, etc. My understanding is that sign language is no different.
The translator makes choices to convey meaning, as well as the literal sense.
I'm not an expert; I asked my friend.
She is hearing but has deaf parents and grew up with ASL.
I should have said my statement was a regurgitation of someone else’s words, either way you’re also correct.
I have no dog in this argument, and my statement should be taken as a "this is what I understand" and an addition to the conversation, not a "nah, y'all wrong" statement.
So…
Your argument for translating this into a different language is that anytime you translate it, that changes what it says?
Not translating is still best.
And it's pretty offensive that I've already seen comments in here saying deaf people read slower than people who can hear, so hopefully that's not what you're about to throw out.
Being deaf doesn’t mean someone can’t read well, that’s a really old stereotype. If a deaf person is a slow reader that’s not because they’re deaf.
I think you may be confused as to who you’re responding to. I’m reading some outrage in your response that is directed towards others and their statements, nothing that I’ve written or believe.
There's no argument to be made. A (good) translator into another language will take into account the intent of the original language and translate it into a comparative version. That can mean changing stories, or idioms that no longer land in the new language.
I’m not the person who made any claim about reading speeds, and I would disagree wholeheartedly with that baseless statement.
I'm profoundly deaf. I sign, write and speak. :)
Well, sign language isn't superior. Having both is better: subtitles (for hard of hearing people) and sign language (for deaf people). I prefer subtitles because they are closer to the speech, and I'm not fond of the sign video. Often the sign interpreter is small and signs very quickly.
In general, I prefer text; it helps me focus on the content instead of the person, and it uses less bandwidth…
Sign language still lacks a lot of vocabulary. It's a young language, «created» in the 18th century when the Abbé de l'Épée founded the first deaf school. I had to create a lot of technical signs with sign language interpreters during my agricultural course. Furthermore, there is no official sign writing system yet, which is a problem for preserving human knowledge and culture outside of video and technological devices. So there are still a lot of things to do and improve.
In France, a lot of deaf people aren't fluent in written French due to the lack of bilingual schools (written French and French Sign Language) and interpreters (e.g. only 200 hours of sign language interpretation for one year at university).
So, having sign language improves accessibility a lot for deaf people who aren't fluent in the written language. For me, I prefer both. Both are good, and together they meet different people's needs. :)
Thanks for sharing!
And thanks for your interesting question! :)
Mind you, this only reflects my opinion, and I do think other deaf people will have a different stance from mine about sign language. :)
In addition to the fact that it's not just English via hand gestures, I believe it's done because sign language is speech, with all of the benefits that come with that. There are extra channels of communication present in sign language beyond just the words. There are equivalents of tone and inflection, and (I believe) even accents. Like, this video of this lady performing "Fuck You" in ASL is what made it click for me when I first saw it many years ago. She's just so fucking expressive, in a way that subtitles could never be.
EDIT: changed my wording to be more accurate, since sign language literally is speech through a different medium. There’s no need to draw an unnecessary boundary.
Sign language is speech, it’s just non-verbal speech.
Thanks for pointing this out, I’ve updated my comment to get rid of the unnecessary distinction.
Or this ASL version of Rap God.
Here is an alternative Piped link(s):
https://piped.video/0iDAkEpCmBs
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.
A lot of these comments are American so I thought I would provide a different point of view. In the UK it is a legal requirement for some broadcasters to have a certain percentage of signed programmes.
To add to this, repeats with added sign language were (are?) often broadcast late at night because you were meant to set your video to record them for use as teaching materials. It wasn't just sign language; a lot of the videos shown in school were stuff that had been taped at 3am.
But why is there such a legal requirement?
To add another part of it, for people using BSL, it’s akin to their mother tongue.
Being able to watch content with signing is akin to having a TV show dubbed into your native language, rather than relying on subtitles.
Edit: I just had a check, and it's actually mentioned in the Ofcom guidelines:
"Subtitle users reflect the full range of proficiency in English; some profoundly deaf people regard BSL as their first language, and are less fluent in English."
The UK's always been pretty inclusive, and this law's been around for decades, since way before subtitles were practical, or even visible on crappy old b&w CRT screens.
So it’s just because they haven’t bothered updating some guideline booklet about new technologies?
Nobody has gone: "BTW, this new thing called subtitles could actually replace sign language requirements, especially now that we have color TVs."
That said, I can imagine sign language being better for real-time interpretation than someone typing in the speech, unless they use some really good transcription software.
I'll just reply on this one too; we have fairly detailed recommendations and guidelines on access services in the UK. If you're curious, it's summarised really well in this document (10 pages).
Live subtitles always used to be done using a stenograph or similar, though having a look now, speech-to-text seems more common. Since I happened upon it too, here is a cool white paper by BBC R&D on inserting a longer delay into live events to allow the subtitles to follow more closely.
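If anyone's curious what that delay trick amounts to, here's a toy sketch of the arithmetic (the constants and the function name are made up for illustration; this isn't the BBC's actual numbers or pipeline). The idea is simply that a caption produced a few seconds after the speech can still reach the screen before the matching picture, as long as the picture is held back slightly longer than the captioning latency.

```python
# Toy illustration of delaying a live feed so captions can catch up.
# BROADCAST_DELAY_S and CAPTION_LATENCY_S are assumed example values,
# not figures from the BBC paper.

BROADCAST_DELAY_S = 6.0   # how long the live picture is held back
CAPTION_LATENCY_S = 5.0   # how long speech-to-text/stenography takes per phrase

def caption_offset(speech_time_s: float) -> float:
    """Seconds between the caption being ready and the matching picture
    going to air. Negative means the caption is ready first, i.e. it can
    be shown in sync with the delayed picture."""
    picture_airs_at = speech_time_s + BROADCAST_DELAY_S
    caption_ready_at = speech_time_s + CAPTION_LATENCY_S
    return caption_ready_at - picture_airs_at

print(caption_offset(0.0))  # -1.0: caption is ready 1 s before the picture airs
```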
Most of my girlfriend’s family is deaf. They read fairly slowly and end up usually not really following subtitles very easily. Sign language is fastest for them to understand.
I've heard that because written English is phonetic (meaning it shows, approximately, how the words sound), it doesn't make the same sense to people who have always been deaf, and reading words is a bit like reading a bunch of telephone numbers and remembering what they mean.
I.e. the same as a programming language, which can easily be learned and read at astounding speed… Also, written English is one of the least phonetic languages you could possibly find.
Not really. You can still sound out the phonemes in a programming language. Perhaps not if the whole thing were Perl memes. And while I agree English orthography is a mess, as far as "not phonetic" goes it can't hold a candle to Chinese.
Maybe Chinese is a better comparison, I hadn’t thought of that.
If sign language is your first language, any written language is like a foreign language that you might’ve learned but aren’t a native speaker in.
ASL (or whichever sign language) is NOT a direct visual translation of English or French or Mandarin or whatever. It's a totally different language, and the written language is a second language. People might be highly proficient at reading and writing English in an English-speaking country, but it's a different language.
And incredibly regional as well.
Any isolated language with a small local population is going to diverge quickly, and while the Internet is bringing everyone together and making written language more consistent, it's not like deaf people send each other videos online; they just use written English because it's insanely easier and faster for everyone.
I knew a 3/4 deaf girl who had learned ASL, who had a bf who was fully deaf from birth… he did send her videos of himself signing.
Sure, lots of couples send videos to each other.
But I doubt that was their sole method of conversation
I meant more for general conversation. Like, imagine if instead of scrolling comments on here, we had to watch a video of what everyone said without knowing where it was going.
Completely impractical
Well, it's not like being handicapped is practical.
People with disabilities have to deal with impractical situations all the time, because what is practical for able people is just not feasible, or extremely impractical, for them, and society is far from being inclusive.
A decent number of deaf people don't speak English, so they wouldn't be using written English. Schools that teach both are actually called dual-language schools.
Deaf people who can't hear at all still read and write. Please stop speaking for a lifestyle you don't know anything about.
😂 I'm deaf, you numpty. There are entire deaf communities that don't read or write English. It's actually a hotly debated topic, as some think kids shouldn't be forced to learn both.
Only in 'murica (and the anglosphere) could people think that learning more languages could possibly be a bad thing…
I’m not American but it’s suggested that learning a sign language and a ‘spoken’ language at the same time can slow the acquisition of both.
We see it in kids with two ‘spoken’ languages too but I believe to a lesser extent.
If I had a deaf kid I would teach them both but I understand the choices of parents that don’t do that.
And they’d have no idea what ASL was…
So what’s your point?
Not even every English speaking country uses ASL, and it’s different in different regions even in America.
You’ve got confused. A lot of deaf people speak ASL, BSL, AUSLAN etc exclusively. They don’t speak English. Speaking both is bilingual.
Who’s talking about speaking?
We’re talking about reading/writing/typing…
I would be insanely surprised if someone used American/British/Australian sign language yet didn't know a single written language, especially not the one used in their geographical location.
This explains why a fair amount of deaf people don’t use written language.
I didn't know about this before, but it does make a lot of sense :)
I am not Deaf, but I imagine it is easier having stuff presented in your native language.
It's also simpler, faster, and more accurate to have a live translator than to have someone type.
For one thing, there are probably people who know sign language but cannot read.
Because it's some people's native language, and for those people English (or whatever the spoken/written language of the land is) is their second language. Sign languages aren't just hand-signed versions of the local spoken language; those systems do exist, like ESL (English as a Signed Language), but the Deaf in America and England don't use ESL, we use ASL (American Sign Language) and BSL (British Sign Language) respectively. These are very different languages from each other and from ESL. They don't even share fingerspelling alphabets.
Captions are amazing for the hard of hearing and late deafened, especially since many children such as myself who grew up hard of hearing were denied sign. But it’s my language by right and I was denied it as a native language. It’s natural for face to face communication in a way writing isn’t and it’s also a cultural language. A Deaf five year old can understand the news broadcast in sign language just as well as a hearing one can understand the spoken one.
I have only ever seen this at live events, and so the people actually there would not be able to see subtitles.
Besides being more natural to follow for native sign "speakers" (do you say signers? No idea), at live broadcasts it is way more efficient than live subtitling.
ASL has a very different structure from spoken/written English, so not everybody who signs is going to comprehend English grammar as fluently/easily, or the nuance of all the words that don't have a sign equivalent.
Additionally, ASL communicates who is talking and the tone of their words, even when the speaker is off screen, which just can't be captured by captioning. Closed captioning has only just caught on to using slightly different colors to indicate the speaker, so you know who's talking offscreen. I've only seen this in British panel shows so far, but it's helpful.