Microsoft introduces Sign Language View, a new Teams feature that improves the meeting experience for sign language users, building efficiency and community.
Microsoft is pleased to announce Sign Language View, a new meeting experience in Microsoft Teams that helps signers (people who are deaf or hard of hearing, interpreters, and others who use sign language) keep one another center stage, always in the same place, in every meeting.
“As a deaf person who attends meetings in Teams multiple times a day, I am well aware of the challenges that virtual meetings impose on users who are deaf or hard of hearing (D/HH). I face them too. As Microsoft Teams Meetings, Calling and Device Accessibility Architect, one of my main responsibilities has been to shape the vision for a best-in-class experience for the D/HH community in Teams,” said Chris Sano, Accessibility Architect, Microsoft Teams.
To learn more, Microsoft has talked to many D/HH users, listened to their feedback, and mapped out a roadmap of improvements. Sign Language View is a first step in addressing several requests from the D/HH community, including:
• Keep the video feeds of interpreters and other sign language users in a consistent location.
• Ensure that video feeds are the right shape and size so that signing is visible.
• Allow participants to keep up to two other sign language users in view during the meeting.
• Reduce repetitive meeting setup tasks, such as pinning interpreters and turning on closed captioning at the start of each meeting.
When Sign Language View is enabled, the prioritized video streams automatically appear in the appropriate aspect ratio and at the highest quality available. Like pinning and closed captioning, Sign Language View is a personal setting and does not affect what others see in the meeting. It also adapts to different needs: it can be turned on on the fly during a meeting or set as a default for all calls.
When Sign Language View is enabled, the video feeds of the designated signers remain visible center stage as long as their video is turned on. Other participants can still be pinned or spotlighted without encroaching on the signers’ space.
When someone shares content in the meeting, the prioritized signer’s video shifts position but remains visible in high quality and at a larger size than the other participants’ video feeds. Sign Language View can also be turned on by default for all meetings, and you can identify in advance the signers you regularly work with within your organization.
Sign Language View and the accessibility settings pane are currently in public preview and will begin rolling out to Teams desktop and web clients for commercial and GCC customers in the coming weeks. For detailed instructions on how to enable it, see the Microsoft Teams public preview documentation on Microsoft Learn.
These features are just the beginning: one step on a much longer journey. Microsoft is committed to creating a meeting experience in Teams that is not only accessible but also enjoyable for users who are deaf or hard of hearing. Users can share feedback via the Help menu within Teams.