
Music-Tech Startup Moises Raises Oversubscribed Seed Round of $1.6 Million

source link: https://venturebeat.com/2021/08/03/music-tech-startup-moises-raises-oversubscribed-seed-round-of-1-6-million/
Press Release



Artificial Intelligence (AI) Platform Celebrated by Musicians, Creative Professionals for Solving Decades-Old Problem & Democratizing Technology for the Masses



SALT LAKE CITY–(BUSINESS WIRE)–August 3, 2021–

Today, music-tech startup Moises announced it has raised an oversubscribed seed financing round of $1.6 million, led by Utah-based Kickstart Fund, with participation from Valutia, Verve Capital, and others.

While already profitable and financially self-sustaining after less than two years, the company sought additional funding to focus on growth and marketing to support its goal of leading the future of artificial intelligence (AI) in the music industry.

Since its launch in 2019, the company has grown from 50,000 registered users to 4 million users today, including more than 1.5 million monthly active users across 210 countries. It recently ranked #2 in the Music category of the U.S. Apple App Store and #1 in Japan, the two largest markets for the music industry.

Moises is a music-tech platform that democratizes access to cutting-edge AI audio technologies and empowers music creators to achieve their full creative potential.

“I started Moises to solve a decades-old problem: give music creators the ability to split vocals from instruments and other musical elements in a simple and fast way, so they can create, practice, produce, teach, and perform music anywhere,” said Geraldo Ramos, cofounder and CEO.

“As an amateur drummer and tech entrepreneur, I’m honored to see that our product is resonating with funders and music lovers alike. We can’t wait to show the world what’s next as we push the boundaries of artificial intelligence in the music industry.”

“We’re thrilled to be a key investment partner in Moises as it leads the way in AI, music and tech,” said Sydnie Keddington, Kickstart Associate. “It’s exciting to support a homegrown company with tremendous global reach, and we look forward to continuing our partnership with Geraldo and his team to help the company expand further.”

Moises is the second U.S. company created by Ramos, who previously founded the online technology mentoring platform HackHands, which was later acquired by Pluralsight after less than two years. He also founded several companies in his home country of Brazil.

The company is based in Salt Lake City with staff in the Brazilian cities of São Paulo, Rio de Janeiro, Recife, Foz do Iguaçu, and João Pessoa.

About Moises

Moises is a music-tech platform offering musicians and producers a suite of artificial intelligence (AI) tools such as audio source separation, pitch/beats/chord detection, metronome, tempo changer, and mastering.
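
Moises' own models are proprietary, but the kind of functionality described above can be sketched with open-source tools. The snippet below is an illustrative example only, not the company's implementation, using the Spleeter library for vocal/accompaniment separation and librosa for beat detection; the audio file name is a placeholder.

    # Illustrative sketch only: open-source stand-ins for the kinds of AI audio
    # tools described above. This is not Moises' proprietary pipeline.
    from spleeter.separator import Separator
    import librosa

    # Source separation: split a song into vocals and accompaniment using a
    # pretrained 2-stem Spleeter model. "song.mp3" is a placeholder path.
    separator = Separator("spleeter:2stems")
    separator.separate_to_file("song.mp3", "output/")  # writes vocals.wav and accompaniment.wav

    # Beat/tempo detection: estimate BPM and beat positions with librosa.
    y, sr = librosa.load("song.mp3")
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    print(f"Estimated tempo: {float(tempo):.1f} BPM, {len(beat_frames)} beats detected")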

Founded by tech entrepreneurs Geraldo Ramos, Eddie Hsu, and Jardson Almeida, Moises has a mission to empower artists to achieve their full creative potential by democratizing access to cutting-edge audio technologies.

Since its launch in 2019, the company has grown from 50,000 registered users to more than 4 million users today, including more than 1.5 million active users across 210 countries. The mobile app, which is available in 21 languages, has reached the top 10 charts (Apple App Store & Google Play) in over 40 countries (including the U.S.) for both iOS and Android and was recently featured in the Google Play “Apps we love” editorial.




View source version on businesswire.com: https://www.businesswire.com/news/home/20210803005291/en/

Jake McCook, [email protected]


Sponsored

Qualcomm Technologies VP talks about AI edge solutions that are galvanizing the market

VB Staff | July 07, 2021 08:20 AM


Presented by Qualcomm


Edge computing is accelerating and is already starting to reshape the future of the datacenter, 5G edge computing, and 5G infrastructure. It’s transforming industries, business models, and experiences, connecting unconnected things, and making it possible to reimagine how the world works, plays, and lives.

In this VB Q+A, Mike Vildibill, the new VP of Product Management at Qualcomm Technologies, talks about the explosive demand for AI inferencing in the cloud, the innovative cloud-to-edge AI solutions that are galvanizing the market, and his vision for Qualcomm Technologies in his new role.

VB: What’s exciting you most in the AI cloud space right now?

MV: During my career, we’ve seen a phenomenal movement of processing from local, on-prem systems at the enterprise into the cloud. We’ve seen the implications of this not only for the IT industry, but for how we compute, how we manage data, how we manage our personal data, and how we communicate with each other. And now there’s been another massive shift: we’re watching processing migrate from the cloud to the edge. Computing now has to happen closer to the action, where the people and devices are, at the edge of the cloud, not back in a data center in another part of the continent. This is again having profound implications, in things we see in our daily lives.

By doing a lot more immediate computation on that data closer to us, we get lower latency and faster access, as well as more personalized experiences and more computing performance, and it can be done at low power, close to you. This general migration to the edge means that intelligence is moving toward the edge, and all that goes along with it.

VB: What would you say is the biggest news in this generation of chips that’s facilitating everything you just talked about?

MV: To begin with, I believe that 5G is a game changer. It allows for very low-latency, high-bandwidth communications to anywhere, including remote locations. You no longer need to be next to a data center, or to have a big, expensive wiring closet full of telco gear. A system can be connected to the internet via a 5G-enabled mobile device, for example, and that enables big use cases across industries: smart security cameras in stores that can also be used for stock tracking, off-limits areas, and theft alerts; airport monitoring cameras to track loitering, lost children, off-limits areas, and suspicious packages; even automatic payment at the fast-food drive-through line, including unpaid alerts. And so much more.

But you also need power efficiency. That comes down to high performance delivered in a very power-efficient way. That’s where Qualcomm comes into the story, with the Cloud AI 100 product line, for example. It offers extreme power efficiency and extreme performance, and this allows for deployment in ways and in places that otherwise simply couldn’t be conceived. For the first time, we have a single product and a single software tool chain that can span that extreme expanse, providing multiple orders of magnitude in performance with one tool chain, one set of tools, one interface. Cloud AI 100 is doing that.

VB: What do you see as a business leader’s biggest challenge right now as the technology leaps ahead?

MV: I talk with my counterparts in other companies often, including the startups. It’s very clear that the market is moving very fast, the market dynamics are evolving very quickly, and requirements are changing. The whole AI and inference market is still rather nascent; it’s still going through a lot of growth. You need to keep your eye on the research, on what’s being done in academic environments, to see what the next big disruption is. But I guarantee there will be a disruption, because the market is so young in its life cycle; it hasn’t yet reached a steady state. As a business leader, making sure that we’re prepared for disruption, and that we can not only manage through it but actually take advantage of it when it happens, that’s the exciting part.

VB: How does an all-in-one product simplify tasks for IT business leaders and those who work with them?

MV: Let me use the following example. Think of yourself as a systems integrator or an OEM. Your job is to support your enterprise customers. Without naming names, I can think of enterprise companies that sell furniture directly to the public. They have good meatballs, by the way, at warehouse scale. I’m using that as an example as a type of company who needs a lot of computer vision. They want to watch what’s going on in the store. Everything from people to safety to inventory in the store. There’s a phenomenal amount of stuff that goes on in a store like that. But that same company has their own supply chain, their own distribution chain, their own business logic that goes on in the background, that does happen in the cloud, or in an on-prem — they have mainframes and big supercomputers crunching a lot of numbers in a central location as well.

As you can imagine, a company like that does AI in their data center: recommendation workloads, analytics, language processing. They have a website, so they’re doing NLP with the people who are trying to talk to the chatbots on the website. But that same company is putting these low-power computer vision appliances in their stores. From their perspective, it’s a continuum. There’s no natural divide between what’s in the store and what’s in the data center. There are too many shades of gray in between. Some of them put a small data center in the store. Some of them put the store logic in their data center. It’s not segmented.

As soon as they can have a single partner, a single tool chain, and a single set of tools to debug, optimize, and scale, they’re happy. They really want a single architecture from soup to nuts. What Qualcomm brings to the table is our strategy and our belief in open frameworks. What our customers are telling us very clearly is that this also lets them integrate what we do with other IT infrastructure that adheres to open industry standards and open-source models, and with the rest of their IT environments, very easily.

Some vendors in the AI space are pursuing a walled-garden strategy, which makes it that much more difficult to easily and seamlessly interoperate with IT equipment that doesn’t come from that walled-garden vendor. To answer your question directly: there’s a real advantage to going soup to nuts, and an even greater advantage in doing so with an open framework and open APIs.
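
To make the open-frameworks point concrete, here is a minimal, vendor-neutral sketch under assumed tooling (PyTorch, ONNX, ONNX Runtime), not Qualcomm's actual Cloud AI 100 toolchain or SDK: a model is exported to the open ONNX format and then run with ONNX Runtime, where a hardware vendor's execution provider could be swapped in without changing application code.

    # Vendor-neutral sketch: export to the open ONNX format, then run with
    # ONNX Runtime. Illustrative only; this does not use Qualcomm's SDK.
    import numpy as np
    import torch
    import onnxruntime as ort

    # A stand-in model; any trained network would do.
    model = torch.nn.Sequential(
        torch.nn.Linear(224, 64), torch.nn.ReLU(), torch.nn.Linear(64, 10)
    )
    model.eval()

    # Export once to ONNX, the open interchange format.
    dummy = torch.randn(1, 224)
    torch.onnx.export(model, dummy, "model.onnx",
                      input_names=["input"], output_names=["logits"])

    # Run the same artifact anywhere ONNX Runtime has an execution provider
    # (CPU here; GPU or accelerator back ends plug in without code changes).
    session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
    logits = session.run(["logits"], {"input": np.random.rand(1, 224).astype(np.float32)})[0]
    print(logits.shape)  # (1, 10)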

VB: What is Qualcomm’s vision for the distributed intelligence space?

MV: While Qualcomm Cloud AI solutions are playing their role on the edge-cloud and server side, inferencing large-scale deep neural networks, there is also AI running purely on your device, which is called on-device AI. For example, our Qualcomm Snapdragon 888 5G Mobile Platform has a very powerful 6th-generation Qualcomm AI Engine.

The number of neural networks running on any given smartphone today is staggering, and that processing is ultra-secure, meaning it all happens locally without leaving your device. If you combine our on-device AI technology with cloud AI processing, and then throw in the speed of our 5G solutions as the link, you have the basis of distributed intelligence, where AI is distributed across channels to power new experiences. Qualcomm is an undisputed leader in all of these spaces.
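
As a rough illustration of that distributed-intelligence pattern, the sketch below prefers on-device inference and falls back to a cloud endpoint over the network (for example, a 5G link) when no suitable local model is available. The endpoint URL and helper function are hypothetical placeholders, not Qualcomm APIs.

    # Hypothetical sketch of on-device-first inference with a cloud fallback.
    # CLOUD_URL and run_on_device() are placeholders, not real APIs.
    import requests

    CLOUD_URL = "https://example.com/infer"  # hypothetical cloud inference endpoint

    def run_on_device(features):
        """Stand-in for a local, quantized model call; data stays on the device."""
        return None  # returning None simulates "no suitable local model"

    def infer(features):
        """Prefer on-device inference; fall back to a larger cloud-hosted model."""
        result = run_on_device(features)
        if result is not None:
            return result  # low latency, nothing leaves the device
        resp = requests.post(CLOUD_URL, json={"features": features}, timeout=2)
        resp.raise_for_status()
        return resp.json()["prediction"]  # heavier model served from the cloud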

VB: Where does the new Cloud AI 100 go from here? What is the future?

MV: We’re going to make the investments to ensure that we maintain the leadership position on power efficiency. We believe this unlocks much of what we’ve talked about today. You need extreme performance and extreme efficiency; put the two together and you have an industry-leading power/performance metric, and that’s something you can expect to see Qualcomm continue to push. That’s in our wheelhouse. It’s what we do as a company in the mobile markets and other markets. It’s what we need in order to unlock new segments. We believe it’s fundamental to all these new use cases, even in autonomous driving. We need exactly this: extreme performance and extreme efficiency. First and foremost, that’s it.

We’ll continue to expand into adjacent markets. We’ll continue to improve and invest in software and SDKs and tool chains. We’ll continue to support open source and embrace open-source frameworks. We’re going to continue the playbook you see from us today. Expect more from Qualcomm around power efficiency going forward.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].

