## Overview

Speck™ is a fully event-driven neuromorphic vision System-on-Chip (SoC) that integrates a dynamic vision sensor (DVS) and a spiking convolutional neural network (sCNN) processor on a single chip.
Key highlights:
- Supports large-scale asynchronous sCNNs with **up to 320,000 spiking neurons**
- **Ultra-low power** operation, enabling always-on scene understanding at the milliwatt level
- **Low latency**, with end-to-end response times of only a few milliseconds
- Ideal for embedded and edge intelligence applications
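The "fully event-driven" operation highlighted above can be made concrete with a toy model. The sketch below is not Speck hardware code; it is a minimal leaky integrate-and-fire (LIF) neuron, a standard spiking-network building block, updated only when input events arrive. This is why compute, and therefore power, scales with scene activity rather than with a fixed frame rate.

```python
import math

def lif_response(events, tau=0.05, threshold=1.0):
    """Process a list of (time, weight) input events; return output spike times.

    The membrane potential decays between events and is only updated when an
    event arrives -- no work is done while the scene is static.
    """
    v, t_last, spikes = 0.0, 0.0, []
    for t, w in events:
        v *= math.exp(-(t - t_last) / tau)  # passive decay since previous event
        v += w                              # integrate the incoming event
        t_last = t
        if v >= threshold:                  # fire and reset
            spikes.append(t)
            v = 0.0
    return spikes

# Two closely spaced events cross threshold; a late, isolated one does not.
print(lif_response([(0.00, 0.6), (0.01, 0.6), (0.50, 0.6)]))  # → [0.01]
```

The same principle, applied across hundreds of thousands of neurons in silicon, is what allows the chip to sit at milliwatt power when the scene is quiet.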
More details: https://www.synsense.ai/products/speck-2/
---
## Applications

Speck™ is designed for always-on, real-time vision tasks in embedded and edge systems, such as IoT devices, robotics, and autonomous platforms. Its key capabilities:
- **Ultra-low power**: 100–1000× lower energy consumption compared to conventional vision solutions
- **Ultra-low latency**: < 5 ms end-to-end latency
- **Privacy & security**: processes sparse “dot-matrix” event data on-chip, reducing the need to send raw image data externally
- **Advanced sCNN support**: runs a variety of spiking convolutional neural network algorithms
- **Speck™ Development Kit**: includes the Speck SoC and DVS, providing a hardware development platform for embedded vision
- **Software Toolchain**: SynSense provides open-source tools (e.g., the **SAMNA** runtime and a spiking network toolchain) that let engineers train and deploy up to 9-layer sCNNs to the chip
Recommended development host:
- Ubuntu 18.04 or 20.04
- 1× USB 3.0 Type-A port
- Display with an integrated or discrete GPU (for development)
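To make the “dot-matrix” event data mentioned above concrete: a DVS emits a sparse stream of per-pixel brightness-change events rather than full frames. The snippet below is a hypothetical illustration of that representation; the tuple layout and numbers are invented for the example, not Speck's actual data format.

```python
# Illustrative DVS-style event stream: (timestamp_us, x, y, polarity).
# Only brightness *changes* are reported, so a mostly static scene produces
# almost no data -- and no grayscale image that could identify a person is
# ever formed on the sensor output.
events = [
    (1000, 12, 7, 1),   # pixel (12, 7) got brighter at t = 1000 us
    (1004, 13, 7, 1),
    (1910, 12, 8, 0),   # pixel (12, 8) got darker
]

def event_rate(events, window_us):
    """Events per second within the first `window_us` microseconds."""
    n = sum(1 for t, *_ in events if t < window_us)
    return n / (window_us * 1e-6)

# Compare with a generic 128x128 frame camera at 30 fps, which always
# transfers 128 * 128 * 30 = 491,520 pixel values per second regardless
# of whether anything in the scene moved.
print(event_rate(events, 10_000))  # → 300.0 events/s for this 10 ms window
```

This sparsity is the basis of both the power and the privacy claims: downstream processing touches only the events that actually occurred.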
---
## Why It Matters (for Bioinspired / Neuromorphic Platforms)

- **Brain-inspired design**: combines event-based sensing and spiking neural computation in a single chip, closely mimicking how biological vision systems operate.
- **Edge intelligence**: enables real-time vision inference at the edge with very low power, making it ideal for IoT, robotics, and always-on systems.
- **Scalable SNNs**: with up to 320K neurons, it supports sophisticated spiking CNNs, enabling neuromorphic vision research and new insights into mapping brain-inspired models to hardware.
## Development Support
SynSense provides a complete development ecosystem, including:
### Speck™ Development Kit
A hardware kit integrating the Speck SoC and event-based sensor, enabling rapid testing and deployment of neuromorphic vision applications.