AI Capabilities Platform

With the Orwell underlying algorithm capabilities, the WellSpiking one-stop AI development platform, and the WellData integrated data platform, we form a solid AI foundation that supports all Westwell services.

Supports quantization, pruning, sparsification, and training optimization of most custom algorithm models to maximize model accuracy
Supports mainstream open-source deep learning frameworks and allows users to integrate their own frameworks easily
Provides a series of hardware deployment, calling, debugging, and monitoring tools that help users develop products and maintain projects
Provides various open-source and general-task algorithm libraries that can be trained directly and deployed on hardware platforms
Provides hardware implementation, optimization, export, and simulation tools for a series of algorithm models, allowing users to optimize hardware algorithms independently and effectively isolate service models and data
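As an illustration of the quantization step mentioned above, here is a minimal affine Int8 quantization sketch in plain numpy. This is the generic textbook technique, not Westwell's actual toolchain; the function names and the error bound are illustrative assumptions.

```python
import numpy as np

np.random.seed(0)

def quantize_int8(x):
    """Affine (asymmetric) post-training quantization of a float tensor to int8."""
    qmin, qmax = -128, 127
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q, scale, zero_point):
    """Map int8 values back to approximate float values."""
    return (q.astype(np.float32) - zero_point) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale, zp = quantize_int8(weights)
reconstructed = dequantize_int8(q, scale, zp)
# Rounding error per element is bounded by roughly one quantization step.
max_error = np.abs(weights - reconstructed).max()
```

Pruning and sparsification work analogously: weights below a magnitude threshold are zeroed so the hardware can skip them.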
Orwell VCU V100
Decoding Unit
Dedicated image decoding card. Embedded with a dedicated H.264/H.265 video stream encoding and decoding unit, it supports decoding, image correction, and other operations across 16 video streams to meet the input-image requirements of different AI algorithms.
Orwell PAU V200
Acceleration Unit
Dedicated AI acceleration card. Embedded with a HardRock AI acceleration unit, it supports various deep learning operators, flexibly orchestrates deep learning networks, and supports various AI models such as target detection, image recognition, and semantic segmentation. Features a PCIe Gen3x8 interface, has a power consumption of 30 W, and can work in a temperature range of -20°C to +70°C.
Edge Acceleration Unit
The Orwell edge inference acceleration unit is a heterogeneous computing system based on an Intel CPU and an FPGA. It supports real-time encoding, decoding, and inference acceleration across 16 cameras, as well as Int8/Int16 quantized deep learning operations.
Industrial-Grade Dual-lens Stereoscopic Camera
Meets the requirements of 3D measurement, target identification, distance measurement, and image segmentation and has been verified in Westwell application scenarios.

Millisecond-level imaging

Wide dynamic range

High processing power

40 m line-of-sight

IP67 rated

Low-light imaging

Full-Process Management

Integrates data collection and processing, model training, testing, and deployment to quickly build all required AI models.

Deep Learning

Supports mainstream deep learning frameworks such as TensorFlow, Caffe, and Torch, as well as distributed GPU computing in both single-machine multi-card and multi-machine multi-card configurations.
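The core idea behind multi-card distributed training is data parallelism: each batch is split among workers, each worker computes a gradient on its shard, and the gradients are averaged (the all-reduce step) before the update. A minimal single-process numpy sketch of that loop, using a toy linear-regression problem (illustrative only; real multi-machine setups rely on framework-native distribution):

```python
import numpy as np

np.random.seed(0)

# Toy regression problem: y = X @ w_true + noise
X = np.random.randn(256, 3)
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.01 * np.random.randn(256)

def grad(w, Xb, yb):
    """Gradient of mean squared error for a linear model on one shard."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

n_workers = 4
w = np.zeros(3)
for step in range(200):
    # Each "worker" computes a gradient on its shard of the batch...
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    grads = [grad(w, Xb, yb) for Xb, yb in shards]
    # ...then the gradients are averaged (all-reduce) and applied once.
    w -= 0.1 * np.mean(grads, axis=0)
```

With equal shard sizes, the averaged gradient equals the full-batch gradient, which is why data parallelism scales training without changing the optimization result.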

Excellent Performance

Embeds a large number of CPU/GPU/FPGA servers with 10G NICs and acceleration algorithms for distributed machine learning, providing a solid foundation for model training.

Simple and Convenient Operations

The flexible and agile command-line mode suits the habits of professional users, while the visual mode lets users drag and drop algorithm components to implement service logic through a user-friendly, easy-to-use interface.

Multi-terminal, Multi-scenario Data Aggregation
Providing Comprehensive and Accurate Data Support
Data collection from cameras, IoT devices, PLCs, and other channels
Data integration in multiple scenarios, such as ports, airports, campuses, and communities
Systematic Data Labels, Supporting Real-time Service Queries
Based on service models and scenarios, users can attach labels to data to facilitate subsequent data analysis
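Label-based querying of this kind can be sketched in a few lines; the record schema and field names below are hypothetical, chosen only to illustrate attaching scenario labels to data records and filtering by them.

```python
# Hypothetical records carrying scenario and label metadata.
records = [
    {"id": 1, "scenario": "port", "labels": {"vehicle", "container"}},
    {"id": 2, "scenario": "airport", "labels": {"vehicle"}},
    {"id": 3, "scenario": "port", "labels": {"pedestrian"}},
]

def query(records, label):
    """Return the ids of all records carrying the given label."""
    return [r["id"] for r in records if label in r["labels"]]

vehicle_ids = query(records, "vehicle")
```

A real data platform would index labels rather than scan, but the query contract is the same.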
Unified Data Base, Supporting Agile Upper-Layer Development
The unified data standard ensures orderly data aggregation. Standardized data can be obtained in an agile manner from the unified resource pool to quickly respond to changing requirements.
Contact Us
Service & Support

Quick identification of abnormal indicators

Effective diagnosis of fault causes

24/7 fast response

If you require in-depth consulting, please contact our O&M team.
After-sales tel.: +86-21-33356862, 9:00 to 18:00 (UTC+8), Monday to Sunday