For the first time, new technology will allow professional-grade photorealistic avatars to be created using only a smartphone.
ZURICH, May 18, 2023 /PRNewswire/ — COPRESENCE AG (“copresence”), a provider of digital avatar generation software designed to improve digital communication, has today emerged from stealth to announce the closed beta launch of its platform and app. The first-of-its-kind platform is designed to create a true digital likeness in minutes, for use across gaming, virtual/augmented reality (VR/AR), and video conferencing.
The copresence platform creates a photorealistic 3D model, or “digital twin,” of a user’s face using AI-powered scanning technology compatible with any smartphone. This digital twin can then be used on any platform, including PCs, smartphones, and VR headsets. For gaming, this will allow developers to build experiences in which users play as virtual versions of themselves in any game engine, including Unity and Unreal Engine.
Featuring powerful AI-driven tracking technology that translates a user’s real facial expressions and non-verbal cues onto their avatar in real time, the platform is designed to deliver a more immersive video conferencing experience. By placing video call attendees in the same virtual space, copresence’s software allows users to direct their gaze towards other meeting participants and see who is looking back at them, thanks to the platform’s smart eye-contact cues.
With copresence, users can do away with cartoonish avatars and replace them with a photorealistic, convincingly responsive digital version of themselves.
“Unlike other 3D character generation solutions available today, copresence’s technology creates photorealistic avatars of the highest quality. What’s more, it does so in less than three minutes,” said Radek Mackowiak, copresence CEO. “Thanks to our proprietary tracking technology, users can precisely control the facial expressions of their digital twins, making for the most true-to-life avatar experience available, whether for gaming, VR/AR, or video conferencing.”
The closed beta of copresence’s platform marks the company’s first official announcement as it publicly launches, having been in private development since March 2022. Since then, copresence has assembled a team of industry-leading machine learning and computer vision experts to carry out the fundamental research behind its proprietary technology.
After several iterations and prototypes, the resulting product has already garnered the attention of Fortune 500 companies, along with several AAA gaming studios. With the newly unveiled closed beta, copresence will now gather feedback ahead of an open beta later this year and the eventual full release of its platform, which will include an API as well as a smartphone app.
To date, copresence has raised $2.75 million in seed funding. The company plans to launch the open beta version of its industry-leading avatar generation solution in Q3 2023. Developers interested in signing up for copresence’s closed beta program can visit https://copresence.tech/.
This press release may contain forward-looking statements subject to risks and uncertainties that may cause actual results, performance, or achievements to differ materially from those expressed or implied. These forward-looking statements speak only as of the date of this press release, and copresence undertakes no obligation to update or revise these statements.
About Copresence:
Founded in 2021, copresence offers a digital avatar solution for the connected world. The Swiss tech company’s AI-powered software platform lets users generate a 3D version of themselves using just a smartphone, for use virtually everywhere. The company’s industry-leading technology provides the gaming, VR/AR, and video conferencing industries with the perfect solution for photorealistic 3D avatar creation. Find out more at copresence.tech.
SOURCE Copresence