Samsung Electronics hosts "AI for All: Connectivity in the Age of AI" Press Conference on January 8, 2024
Samsung Electronics will host a press conference on January 8, 2024, one day before the opening of CES 2024, the world's largest information technology (IT) and consumer electronics exposition. The main theme of the event is "AI for All: Connectivity in the Age of AI".
Jong Hee Han, Vice Chairman and CEO (DX Division Head) at Samsung Electronics, will headline the event and unveil Samsung Electronics' AI strategies. The press conference will be streamed live online. Learn more at the Samsung Electronics Newsroom.
Tutorial: Developing a Matter Virtual Device on the SmartThings Platform
Samsung SmartThings provides the Matter virtual device application to help you develop devices that support Matter, an IP-based connectivity standard for smart home and Internet of Things (IoT) devices.
When Matter was first introduced, platform companies faced challenges in testing and optimizing Matter on their own devices. To address this issue, SmartThings developed a Matter virtual device application, which can be used to test and optimize various types of devices, including ones that have not yet been released. The Matter virtual device application is available through the Matter open source project.
This tutorial demonstrates how to create a virtual device using the Matter virtual device application. Read the tutorial at the Samsung Developer Portal.
Tutorial: Developing Matter IoT Applications Using SmartThings Home API
Samsung SmartThings unveiled the SmartThings Home API at the Samsung Developer Conference 2023.
SmartThings has been building an open ecosystem. The SmartThings Home API enables our partners to use the SmartThings cloud, which means their applications can support Matter devices connected to any of the numerous SmartThings hubs worldwide. These applications can also integrate with video and music streaming or healthcare services to provide a rich user experience.
This tutorial demonstrates how to use the SmartThings Home API to create an IoT application. Learn more and read this tutorial at the Samsung Developer Portal.
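The Home API itself is covered in detail in the tutorial. As a loosely related illustration of programmatic SmartThings device access (not the Home API), here is a minimal Python sketch against the public SmartThings REST API; it assumes you have a personal access token with device permissions exported as SMARTTHINGS_TOKEN.

```python
# Minimal sketch using the public SmartThings REST API (not the Home API):
# list the devices visible to an account, including Matter devices joined
# through a SmartThings hub, and send a simple switch command.
import os
import requests

BASE_URL = "https://api.smartthings.com/v1"
TOKEN = os.environ["SMARTTHINGS_TOKEN"]  # personal access token (assumption)
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_devices():
    """Return the devices visible to this account."""
    response = requests.get(f"{BASE_URL}/devices", headers=HEADERS, timeout=10)
    response.raise_for_status()
    return response.json()["items"]

def switch_on(device_id: str) -> None:
    """Send a 'switch on' command to a device that exposes the switch capability."""
    body = {"commands": [{"component": "main", "capability": "switch", "command": "on"}]}
    response = requests.post(f"{BASE_URL}/devices/{device_id}/commands",
                             headers=HEADERS, json=body, timeout=10)
    response.raise_for_status()

if __name__ == "__main__":
    for device in list_devices():
        print(device["deviceId"], device.get("label"))
```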
Expanding the Smart Home Universe with "Hub Everywhere"
The latest tools and features for SmartThings were introduced at the Samsung Developer Conference 2023. SmartThings makes it easier for everyone to start and grow their smart home with support for third-party Matter bridges. Matter bridges serve as connectors that seamlessly integrate non-Matter devices into Matter ecosystems and ensure an effortless setup for users.
Support for Matter bridges has now been extended to the “Works With SmartThings” program, offering more opportunities to connect with millions of Samsung users. Learn more at the SmartThings blog.
Tutorial: Testing Watch Face Studio on Remote Test Lab
Did you know you can use Remote Test Lab (RTL) to test the watch faces you design with Watch Face Studio (WFS)? Remote Test Lab lets you test your applications on actual Samsung Galaxy devices without physical access to them, so you can make sure your watch faces work correctly on all the latest Galaxy Watch models.
This tutorial guides you through connecting Watch Face Studio to Remote Test Lab, deploying your watch faces to remote Samsung Galaxy Watch devices, and testing various features on them, including complication slots. Learn more and read the tutorial at the Samsung Developer Portal.
Tutorial: Connecting Galaxy Watch to Watch Face Studio over Wi-Fi
Watch Face Studio (WFS) is a powerful tool for designing watch faces for Galaxy Watch devices. It also offers an easy way to test and deploy your watch face design on a real device.
This tutorial shows you how to connect a Wear OS-based Galaxy Watch to Watch Face Studio over Wi-Fi and then deploy and test a watch face project on the watch. You will also learn how to establish the Wi-Fi connection using Android Debug Bridge (ADB) commands. Read the tutorial at the Samsung Developer Portal.
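For reference, the ADB side of the Wi-Fi connection can also be scripted. Below is a minimal, hypothetical Python sketch that wraps the standard adb connect and adb devices commands; it assumes adb is installed on your PATH and that "Debug over Wi-Fi" is enabled on the watch, which then displays the IP address and port to use.

```python
# Minimal sketch: connect a Galaxy Watch to the host over Wi-Fi via ADB so that
# Watch Face Studio can deploy to it. Assumes adb is on PATH and "Debug over
# Wi-Fi" is enabled on the watch (the watch shows its IP address and port).
import subprocess
import sys

def adb(*args: str) -> str:
    """Run an adb command and return its standard output."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
    return result.stdout.strip()

def connect_watch(ip: str, port: int = 5555) -> None:
    # Establish the TCP/IP debugging connection to the watch.
    print(adb("connect", f"{ip}:{port}"))
    # Verify the watch now appears in the device list; Watch Face Studio
    # can then deploy and run the watch face project on it.
    print(adb("devices"))

if __name__ == "__main__":
    # Example (hypothetical address shown on the watch):
    #   python connect_watch.py 192.168.0.42 5555
    connect_watch(sys.argv[1], int(sys.argv[2]) if len(sys.argv) > 2 else 5555)
```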
Stay Ahead: Galaxy Emulator Skins for Tab S9 Series and S23 FE Released
The latest Galaxy Tab S9 series and Galaxy S23 FE emulator skins are now available. Galaxy emulator skins let you test your application on Android virtual devices with the look and feel of these latest flagship devices. Download the Galaxy emulator skins from the Samsung Developer Portal and take your application to the next level!
Evaluating Robustness of Language Understanding Models
In a spoken dialogue system, the Natural Language Understanding (NLU) model is preceded by a speech recognition system, and speech recognition errors can degrade the performance of the NLU model. Engineers from the Samsung R&D Institute Poland, in collaboration with scientists from Adam Mickiewicz University in Poznań, developed a new method for evaluating the robustness of natural language understanding models to speech recognition errors.
The proposed method does not rely on the availability of spoken corpora and does not require any additional data annotation. This means the dataset used for training and testing the NLU model can be repurposed for cost-effective robustness assessment. Learn more about the method at the Samsung Research blog.
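As a simplified illustration of the general idea (not the published method), the sketch below corrupts text-only test utterances with ASR-like character noise and reports the accuracy gap between clean and noisy inputs. The noise model, toy classifier, and test set are all hypothetical.

```python
# Simplified illustration: estimate how much an intent classifier degrades when
# its text input is corrupted with ASR-like noise, using only a text dataset.
import random

def simulate_asr_noise(text: str, error_rate: float = 0.1, seed: int = 0) -> str:
    """Corrupt a transcript with character deletions/substitutions to mimic ASR errors."""
    rng = random.Random(seed)
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    out = []
    for ch in text:
        r = rng.random()
        if r < error_rate / 2:
            continue                                                # deletion
        out.append(rng.choice(alphabet) if r < error_rate else ch)  # substitution or keep
    return "".join(out)

def robustness_gap(nlu_model, test_set) -> float:
    """Accuracy on clean text minus accuracy on noisy text (larger gap = less robust)."""
    clean = sum(nlu_model(text) == label for text, label in test_set) / len(test_set)
    noisy = sum(nlu_model(simulate_asr_noise(text)) == label for text, label in test_set) / len(test_set)
    return clean - noisy

# Toy keyword-based "NLU model" and a tiny labeled test set for demonstration.
toy_model = lambda text: "weather" if "weather" in text else "music"
toy_test = [("what is the weather today", "weather"), ("play some music", "music")]
print(robustness_gap(toy_model, toy_test))
```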
Auto Off-Target: a New Scalable Way of Testing Complex Software Systems
Complex software systems that power OS kernels, firmware, basebands, IoT devices, and automobiles form the foundation of an infrastructure that billions of people use every day. Thorough testing of these systems is crucial, but they often cannot be tested on target devices with typical software testing techniques and tools. A promising alternative is off-target (OT) testing, in which part of the code is extracted and adapted to run on a different hardware platform. Unfortunately, this technique does not scale well because creating an OT program is a manual process.
A solution for OT testing scalability is Auto Off-Target (AoT), a new way to test complex software systems. AoT can automatically create off-target programs in C using information extracted from the source code and the build process, and it also provides mechanisms for troubleshooting missing program state in the OT code. Learn more about AoT at the Samsung Research blog.
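To illustrate the off-target concept (a toy sketch only, not the AoT tool itself), the snippet below wraps a function "extracted" from a larger C code base in a standalone harness file that can be compiled and exercised on a host machine. The function, template, and file names are made up for the example.

```python
# Toy sketch of off-target (OT) testing: emit a self-contained C file that
# exercises a function extracted from a larger code base on the host machine.
# Everything here (function, harness, file name) is hypothetical.
EXTRACTED_FUNCTION = """
int parse_length(const unsigned char *buf, unsigned int size) {
    if (size < 2) return -1;
    return (buf[0] << 8) | buf[1];
}
"""

HARNESS_TEMPLATE = """#include <stdio.h>

{function}

/* Minimal off-target driver: feed a fixed input to the extracted function. */
int main(void) {{
    unsigned char buf[] = {{0x01, 0x02, 0x03}};
    printf("parse_length -> %d\\n", parse_length(buf, sizeof(buf)));
    return 0;
}}
"""

def generate_off_target(function_source: str, out_path: str = "off_target.c") -> None:
    """Write a standalone harness around the extracted function."""
    with open(out_path, "w") as f:
        f.write(HARNESS_TEMPLATE.format(function=function_source))

if __name__ == "__main__":
    generate_off_target(EXTRACTED_FUNCTION)
    # Build on the host, for example: gcc off_target.c -o off_target
```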
A Model for Every User and Budget: Label-Free and Personalized Mixed-Precision Quantization
Research in Automatic Speech Recognition (ASR) shows that larger models yield better results. State-of-the-art networks continue to grow, as they must be able to process a wide range of accents, languages, and vocal styles with high precision.
However, different devices impose different memory constraints. A larger, more sophisticated model could run on a mobile phone, whereas a watch could host a more compressed version. In a distributed setting, only the speech of a single user needs to be transcribed, so a compressed model can suffice.
This study introduces myQASR, a personalized compression method that needs only unlabeled samples from target users to optimize performance for a narrow set of vocal characteristics and a wide range of device sizes. Learn more about myQASR at the Samsung Research blog.
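As a rough sketch of label-free mixed-precision quantization (an illustration of the general idea, not the exact myQASR algorithm), the snippet below measures per-layer sensitivity on unlabeled inputs and greedily lowers bit-widths until a memory budget is met. The toy layers, inputs, and budget are hypothetical.

```python
# Rough sketch: assign per-layer bit-widths using only unlabeled user data,
# lowering precision where the layer's output changes the least, until the
# total weight memory fits a budget. Not the exact myQASR algorithm.
import numpy as np

rng = np.random.default_rng(0)

def quantize(w: np.ndarray, bits: int) -> np.ndarray:
    """Uniform symmetric quantization of a weight matrix to the given bit-width."""
    scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

def layer_sensitivity(w: np.ndarray, x: np.ndarray, bits: int) -> float:
    """Output deviation caused by quantizing one layer, measured on unlabeled inputs."""
    return float(np.mean(np.abs(x @ w - x @ quantize(w, bits))))

def assign_bitwidths(layers, unlabeled_x, budget_bits, candidates=(8, 6, 4, 2)):
    """Greedily lower the bit-width of the least sensitive layer until the budget is met."""
    bits = {name: candidates[0] for name in layers}
    total = lambda: sum(w.size * bits[n] for n, w in layers.items())
    while total() > budget_bits:
        best = min(
            (n for n in layers if candidates.index(bits[n]) + 1 < len(candidates)),
            key=lambda n: layer_sensitivity(
                layers[n], unlabeled_x, candidates[candidates.index(bits[n]) + 1]),
        )
        bits[best] = candidates[candidates.index(bits[best]) + 1]
    return bits

# Toy two-layer model and a batch of unlabeled user features.
layers = {"enc": rng.normal(size=(16, 16)), "dec": rng.normal(size=(16, 16))}
x = rng.normal(size=(32, 16))
print(assign_bitwidths(layers, x, budget_bits=16 * 16 * 2 * 5))  # ~5 bits per weight
```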