
‘Hey, GitHub!’ will let programmers code with just their voice

source link: https://www.theverge.com/2022/11/9/23449175/hey-github-voice-copilot-code-programming-system


GitHub is experimenting with a new way to let programmers create code with their voice inside Copilot.

Nov 9, 2022, 5:35 PM UTC


GitHub’s Copilot system
Image: GitHub

Microsoft-owned GitHub is experimenting with a new voice-based interaction system for its Copilot software. “Hey, GitHub!” will allow programmers to code using only their voice, with no keyboard required, much as you’d speak to Siri, Alexa, or Google Assistant.

The new experiment will be available in Copilot, a $10-per-month AI tool that GitHub launched earlier this year to help developers write code. Copilot suggests the next line of code as developers type inside an integrated development environment (IDE) such as Visual Studio Code, Neovim, or the JetBrains IDEs, and it can even suggest complete methods and complex algorithms, alongside boilerplate code and assistance with unit testing.
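
To illustrate the kind of completion Copilot provides, here is a minimal sketch of the interaction: a developer types a comment and a function signature, and the tool proposes the body. The function and the suggested code below are hypothetical examples, not taken from GitHub’s documentation.

    # Developer types a short comment and a signature...
    def is_palindrome(text: str) -> bool:
        """Return True if text reads the same forwards and backwards."""
        # ...and a Copilot-style completion fills in the rest:
        cleaned = "".join(ch.lower() for ch in text if ch.isalnum())
        return cleaned == cleaned[::-1]

    print(is_palindrome("A man, a plan, a canal: Panama"))  # True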

“With the power of your voice, we’re excited about the potential to bring the benefits of GitHub Copilot to even more developers, including developers who have difficulty typing using their hands,” explains GitHub in a blog post today. “‘Hey, GitHub!’ only reduces the need for a keyboard when coding within VS Code for now, but we hope to expand its capabilities through further research and testing.”

The addition of voice-powered code creation will be particularly helpful for accessibility scenarios. You’ll be able to ask Copilot to move to different lines of code or navigate to methods or blocks using only your voice. Voice commands can also control Visual Studio Code itself, with phrases like “run the program” or “toggle zen mode.” And if you want to know what a chunk of code does, you can ask for a summary of it.

This new voice system is being developed by GitHub Next, a team of researchers and engineers that “investigates the future of software development.” There’s no guarantee it will eventually launch as a full product, but the experiment certainly feels like a straightforward way of integrating speech transcription, which is effectively white-label AI, into GitHub’s Copilot service. You can sign up to join the wait list for Hey, GitHub! right here.

GitHub is also planning to let businesses purchase and manage seat licenses for GitHub Copilot. This will include admin controls for Copilot that can manage the various settings across an organization. GitHub is opening up a wait list for businesses to sign up here.

While GitHub continues to bolster its Copilot service with new features, the software has also been targeted with a proposed class-action lawsuit. The lawsuit accuses Microsoft, GitHub, and OpenAI of facilitating “software piracy on an unprecedented scale” by scraping copyrighted material from the web to train Copilot, which reproduces the code without proper attribution. If the lawsuit is granted class-action status, it could upend the defense that such data collection is covered in the US by fair use doctrine, potentially affecting not only the legality of Copilot but also a whole range of generative AI models.

