- Pre-trained inferences
- Model conversions
- Object and face detection
- Deep learning algorithms
- AI/ML Accelerators, NPU
- Neural Compute Modules
- Product-ready IO boards
- Custom application-specific IO boards
Open Source Tools
- Hardware enablements
- Mainline Kernels
- Display, multimedia tools
- Security patching
- Package management
- Model validations
- Deploy management
- Monitor operations
OpenAIA v1.0:
The first version of OpenAIA is a work in progress; our team is working to deliver state-of-the-art pre-trained AI models that are ready to deploy, so end-user applications can be developed on 2 and 6 TOPS accelerators.
Pre-trained models perform better and reduce training time
Inference at the Edge improves security and responsiveness
Open Source tools offer fail-safe code, free of licence fees
Delivers model conversion and deployment 3x faster and more securely
As in software AI, data is an essential element of hardware AI. Preparing the right data through properly aligned datasets, annotation, and labeling enables deep learning to produce the most accurate AI.
Edgeble follows a defined process for sourcing data from the consumer, industrial, and automotive markets. Data generation, annotation, and labeling are done predominantly in collaboration with leading AI adopters and partners in the AI industry. Edgeble already partners with T-AIM, one of the governing ecosystems for sourcing the right AI data.
Pre-training is the most critical task in producing fast, accurate AI for any application: a pool of models is trained in advance so that customers can develop their AI applications directly from the model library.
Pre-training in hardware AI is not easy. Pre-trained models are mostly user-defined model APIs that require conversion before they can run on an inference Edge device. Edgeble provides a specific set of tools for converting and training these models. Pre-trained models that run on Edgeble's first-generation Neural Compute Modules include object detection, human pose detection, and vehicle ADAS.
AI Accelerators (Neural Processing Units, NPUs) are hardware blocks that run deep learning inference as high-performance parallel computation.
Edgeble Neural Compute Modules (NCM) are hardware AI compute modules designed to accelerate deep learning inference at the Edge by reducing latency and increasing responsiveness. NCMs are built on top of high-performance NPUs within a holistic computing system capable of processing high-rate video and graphics, operating from -40°C to +85°C at 2 TOPS and 6 TOPS.
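To give a feel for what a TOPS figure means in practice, here is a back-of-the-envelope compute-budget sketch. The frame rate and throughput numbers are illustrative assumptions for the arithmetic, not measured Edgeble benchmarks:

```python
# Rough per-frame compute budget for an edge NPU.
# Figures are illustrative assumptions, not measured Edgeble specs.

tops = 6                          # accelerator throughput: 6 tera-ops/second
fps = 100                         # assumed target frame rate

ops_per_second = tops * 10**12    # 6 TOPS = 6e12 operations per second
ops_per_frame = ops_per_second // fps

# At 100 fps, each frame gets a budget of 60 billion operations,
# which bounds how large a detection network can run in real time.
print(f"{ops_per_frame / 1e9:.0f} GOPs available per frame")  # prints "60 GOPs available per frame"
```

A model whose forward pass exceeds this per-frame budget cannot sustain the target frame rate on the accelerator alone.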
The better the software, the better the hardware performs, and AI Accelerators are no exception.
Adopting a community-based approach to developing open source software lets customers change and optimize the code without licensing hassles. Edgeble has a team of experts who believe a community development cycle will improve and maintain AI hardware over the long run, inspiring customers to use it and benefit from it.
Embedded system deployment is hard to design because it involves diverse hardware components and small memory footprints.
Edgeble OpenAIA modernizes the way Embedded AI hardware is deployed at the Edge in a robust, fast, fail-safe, and secure environment. The platform is independent of the underlying hardware, integrating several software tools and hardware modules that can be changed and upgraded over time to improve time-to-market.
OpenAIA at the Edge
OpenAIA makes all kinds of Edge AI-enabled solutions robust, fast, fail-safe, and secure. Here are some recently validated use cases.
Pre-trained models for OpenAIA can identify multiple objects and their locations in an image or a real-time video stream.
With the OpenAIA platform and Edgeble Neural Compute Modules, an object detection model can run directly on your device at over 100 frames per second, with 8K capture from six CSI-2 cameras simultaneously, at a neural speed of 6 TOPS.
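Raw detector outputs like these are typically post-processed with non-maximum suppression (NMS) to merge overlapping candidate boxes into one detection per object. A minimal pure-Python sketch of that standard step (illustrative only, not Edgeble's production pipeline):

```python
# Minimal IoU + non-maximum suppression (NMS), the standard
# post-processing step for object-detection outputs.
# Illustrative sketch, not Edgeble's production pipeline.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, threshold=0.5):
    """Keep only the highest-scoring box in each overlapping cluster."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < threshold for j in keep):
            keep.append(i)
    return keep

# Two heavily overlapping boxes plus one distinct box:
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # prints [0, 2]: the overlapping pair collapses to one box
```

Production pipelines run the same logic on the device, usually vectorized or offloaded, but the algorithm is the same.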
Object detection at real-time traffic signals is difficult: the objects at geo-referenced locations are always moving, and the reference datasets are difficult to annotate.
OpenAIA includes reference pre-trained models based on generic traffic data, so customers can easily integrate and customize them for their applications. Edgeble NCMs are powerful enough to capture traffic data irrespective of environmental conditions.
Applying AI in construction is challenging because it involves the inspection, assessment, and maintenance of civil infrastructure and construction elements.
Edgeble offers industrial-grade NCMs capable of complex AI computations such as construction-element work statistics, worker movement, fire alarm signals, and many more, at operating temperatures of -40°C to +85°C.
Advanced driver assistance systems (ADAS) matter: over 1.2 million people are killed in road traffic accidents globally every year, largely due to human error.
The OpenAIA platform includes trained models for vehicle ADAS, running on the automotive-grade NCMs offered by Edgeble. Most of these models have been verified for lane detection, speed monitoring, and other ADAS features.
Edgeble OpenAIA is an AI Accelerator toolkit for training, accelerating, developing, and deploying high-performance AI Accelerators at the Edge.
A pool of pre-trained models is trained and converted so that AI Accelerator-enabled Edge devices can use them to build applications fast.
OpenAIA v1.0 delivers the toolkit for Rockchip-based, AI Accelerator-enabled Systems on Chip (SoCs) such as the RV1126 and RK3588.
Eventually, but the current version of the toolkit supports Rockchip NPUs.
What is an Edgeble Neural Compute Module?
A Neural Compute Module (NCM) is a compute module with a built-in computing architecture that includes hardware AI blocks such as a neural engine and camera sensor interfaces.
At present, 2 TOPS and 6 TOPS.
Yes. One of the essential tasks of OpenAIA is to enable the onboard hardware with an open source Linux kernel and other software tools.
The Maker version of OpenAIA is free to use, but production and enterprise use require a licence.