---
title: SynSense Speck™
---
# SynSense Speck™ — Neuromorphic Vision SoC
## Overview
Speck™ is a fully event-driven neuromorphic vision System-on-Chip (SoC) that integrates a dynamic vision sensor (DVS) and a spiking convolutional neural network (sCNN) processor on a single chip.

Key highlights:
- Supports large-scale asynchronous sCNNs with **up to 320,000 spiking neurons**
- **Ultra-low power** operation, enabling always-on scene understanding at the milliwatt level
- **Low latency**, with end-to-end response times of only a few milliseconds
- Ideal for embedded and edge intelligence applications
More details:
https://www.synsense.ai/products/speck-2/
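Unlike a frame camera, a DVS emits a sparse, asynchronous stream of events, each carrying a timestamp, pixel address, and polarity. A minimal sketch of such a stream in plain Python (the field names and the `filter_on_events` helper are illustrative only, not the Speck/samna event format):

```python
from collections import namedtuple

# Hypothetical event record: real DVS formats (e.g. AER) differ in encoding,
# but each event carries a timestamp, a pixel address, and a polarity bit.
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

def filter_on_events(events):
    """Keep only ON (brightness-increase) events, as a downstream
    spiking layer might when wired to a single polarity channel."""
    return [e for e in events if e.polarity == 1]

stream = [
    Event(t_us=10, x=3, y=7, polarity=1),
    Event(t_us=12, x=3, y=8, polarity=0),
    Event(t_us=31, x=4, y=7, polarity=1),
]

on_events = filter_on_events(stream)
print(len(on_events))  # 2
```

Because only changing pixels generate events, static scenes produce almost no data, which is what makes always-on operation cheap.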
---
## Applications
Speck™ is designed for always-on, real-time vision tasks in embedded and edge systems. Use-cases include:
- **Smart toys** (gesture control, tracking)
- **Smart home** systems (gesture-based interfaces)
- **Security & safety** (fall detection, human presence, approach detection)
- **Automotive** (lane and sign detection, driver attention monitoring, obstacle detection)
- **Drones & robotics** (object tracking, collision avoidance)
---
## Key Features
- **Ultra-low power consumption**: 100–1000× lower energy than conventional vision solutions
- **Ultra-low latency** (< 5 ms)
- **Privacy-preserving** event-based sensing: only sparse “dot-matrix” activity is processed
- **Supports advanced spiking CNNs**, up to ~9 layers depending on model complexity
- Integrated DVS + neuromorphic processor for fully event-driven pipelines
- Designed for edge deployment in IoT and autonomous systems
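The milliwatt-level figures above follow from the event-driven compute model: a spiking neuron only does work when an event arrives, and stays idle otherwise. A toy integrate-and-fire neuron illustrates the principle (this is a sketch of the general mechanism, not Speck's actual neuron circuit):

```python
def integrate_and_fire(event_weights, threshold=1.0):
    """Toy integrate-and-fire neuron: accumulates synaptic weight per
    incoming event and emits a spike (resetting by subtraction) whenever
    the membrane potential crosses the threshold. Computation happens
    only when events arrive, which is why sparse input means low power."""
    v = 0.0
    spikes = 0
    for w in event_weights:
        v += w
        while v >= threshold:
            v -= threshold
            spikes += 1
    return spikes, v

# Four incoming events, each contributing 0.4 to the membrane potential:
spikes, v = integrate_and_fire([0.4, 0.4, 0.4, 0.4], threshold=1.0)
print(spikes)  # 1 spike emitted; residual potential v == 0.6
```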
---
## Development Support
SynSense provides a complete development ecosystem, including:
### Speck™ Development Kit
A hardware kit integrating the Speck SoC and event-based sensor, enabling rapid testing and deployment of neuromorphic vision applications.
Datasheet:
https://www.synsense.ai/wp-content/uploads/2023/05/20221223_speck2e_devkit_datasheet_final.pdf
### Software Toolchain
- **SAMNA runtime** for model deployment
- Training and conversion tools (e.g., the open-source **sinabs** library) to map spiking CNNs of up to ~9 layers onto the Speck hardware
- Supports Ubuntu 18.04 / 20.04 development environments
Minimum system requirements:
- ≥ 4 GB RAM
- Ubuntu 18.04 or 20.04
- USB 3.0 Type-A
- GPU recommended for training
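The toolchain's core job is turning a trained network into spiking form for the chip. The sketch below shows the underlying idea of rate-coded ANN-to-SNN conversion: reuse the trained weights and replace ReLU units with integrate-and-fire neurons, so that output firing rates approximate the ReLU activations. This is a toy illustration of the principle only, not the sinabs/samna API:

```python
def relu_neuron(inputs, weights):
    """A single ReLU unit from the trained ANN."""
    return max(0.0, sum(i * w for i, w in zip(inputs, weights)))

def if_neuron_rate(inputs, weights, steps=100, threshold=1.0):
    """Drive an integrate-and-fire neuron with the same weighted input
    for `steps` timesteps and return its firing rate (spikes per step).
    For constant positive drive below threshold, the rate converges to
    the ReLU activation."""
    drive = sum(i * w for i, w in zip(inputs, weights))
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += drive
        while v >= threshold:
            v -= threshold
            spikes += 1
    return spikes / steps

x = [0.2, 0.5]
w = [1.0, 0.6]  # weights copied unchanged from the "trained" ANN
ann_out = relu_neuron(x, w)       # 0.5
snn_rate = if_neuron_rate(x, w)   # spike rate close to 0.5
print(abs(ann_out - snn_rate) < 0.02)  # True
```

In a real workflow the conversion, quantization, and placement onto Speck's layer cores are handled by the vendor toolchain; this toy only shows why copying ANN weights into spiking neurons can preserve the network's function.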
---
## Why It Matters for Neuromorphic Platforms
- Combines **event-based sensing** and **spiking neural computation** — closely mirroring biological vision pathways
- Enables **real-time on-chip inference** with extremely low energy use
- Suitable for research on **bio-inspired vision**, **edge SNN deployment**, and hybrid DVS+SNN neuromorphic systems
- Provides a practical, commercially available platform for testing neuromorphic algorithms
---
*Note: All information is based on SynSense’s publicly available material as of [access date].*
## Useful Links
- Product page: https://www.synsense.ai/products/speck-2/
- Application brochure: https://www.synsense.ai/wp-content/uploads/2023/06/Speck-application-brochures-EN.pdf
- Dev kit datasheet: https://www.synsense.ai/wp-content/uploads/2023/05/20221223_speck2e_devkit_datasheet_final.pdf
- SAMNA toolchain: https://www.synsense.ai/products/samna/