DBNN: a BNN Simulation Framework in Python

Summary

Problem: Difficulty prototyping analog biomolecular neural networks before wet-lab validation.

Approach: Implements perceptron-like behavior via chemical reaction networks, using ODE-based simulations (single- and multi-layer) that mirror the Weiss Lab's BNN architecture, all in Python notebooks.

Impact: Enables scalable in silico testing of biomolecular classifiers (e.g., XOR) to accelerate synthetic biology design and diagnostics pipelines.


DBNN

DBNN is a biomolecular neural network (BNN) simulation framework for prototyping and testing biomolecular neural circuits in silico before transitioning to a lab implementation.

> Includes simulation notebooks, numerical integration of ODEs, and demonstrations of single-layer and multilayer biomolecular classifiers.
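As an illustration of the kind of ODE model the notebooks integrate, here is a minimal sketch of a single sequestration-based perceptron under simple mass-action kinetics. All species names, weights, and rate constants here are illustrative choices, not values from the paper: a weighted input drives production of an activator species, a threshold drives production of an inhibitor, and fast mutual sequestration yields a ReLU-like steady-state output.

```python
import numpy as np
from scipy.integrate import solve_ivp

def perceptron_rhs(t, z, u, theta, gamma=100.0, delta=1.0):
    """Mass-action ODEs for one sequestration-based perceptron.

    zp is an activator produced at rate u (the weighted input);
    zm is an inhibitor produced at rate theta (the threshold).
    Fast mutual sequestration (rate gamma) plus dilution (delta)
    makes the steady-state zp approximate ReLU(u - theta) / delta.
    """
    zp, zm = z
    seq = gamma * zp * zm               # zp + zm -> (inert complex)
    return [u - seq - delta * zp,       # d[zp]/dt
            theta - seq - delta * zm]   # d[zm]/dt

def perceptron_output(inputs, weights, theta):
    """Integrate to an approximate steady state; return the activator level."""
    u = float(np.dot(weights, inputs))  # weighted sum sets the production rate
    sol = solve_ivp(perceptron_rhs, (0.0, 50.0), [0.0, 0.0],
                    args=(u, theta), method="LSODA")
    return sol.y[0, -1]

high = perceptron_output([1.0, 1.0], [2.0, 2.0], theta=1.0)  # u = 4 > theta
low = perceptron_output([0.1, 0.1], [2.0, 2.0], theta=1.0)   # u = 0.4 < theta
```

With fast sequestration the unit thresholds its input: `high` settles near (4 − 1)/1 = 3, while `low` stays near zero.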


Project

This project implements a dynamical model of a biochemical feedforward neural network using chemical reaction networks, directly inspired by the paper A Dynamical Biomolecular Neural Network from the Weiss Lab at MIT. The design models core neural operations, such as weighted summation and non-linear activation, using molecular sequestration reactions. These biomolecular perceptrons can be composed into multilayer networks to perform both linear and non-linear classification tasks (e.g., XOR), all while preserving system stability at any scale. That scale has yet to be seen (maybe in a future project…).
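To make the composition story concrete, here is a sketch of how two-layer XOR classification can fall out of sequestration units. It uses the closed-form mass-action steady state of one unit (a quadratic in the activator concentration) rather than full time-course integration, and realizes negative weights chemically by routing them into inhibitor production. All weights, thresholds, and rate constants are illustrative assumptions, not values from the paper.

```python
import math

def sequestration_unit(u, theta, gamma=1000.0, delta=1.0):
    """Steady-state activator level of one sequestration perceptron.

    At steady state the activator zp and inhibitor zm satisfy
        u     = gamma*zp*zm + delta*zp   (activator balance)
        theta = gamma*zp*zm + delta*zm   (inhibitor balance)
    which reduces to a quadratic in zp. For fast sequestration
    (large gamma) the solution approaches ReLU(u - theta) / delta.
    """
    d = (u - theta) / delta             # zp - zm at steady state
    a = gamma * d - delta
    return (a + math.sqrt(a * a + 4.0 * gamma * u)) / (2.0 * gamma)

def xor_net(x1, x2):
    """Two-layer XOR from three sequestration units (illustrative weights)."""
    s = x1 + x2                          # both hidden units weight inputs [1, 1]
    h1 = sequestration_unit(s, 0.5)      # fires when at least one input is on
    h2 = sequestration_unit(s, 1.5)      # fires only when both inputs are on
    # Output unit: a weight of -3 on h2 is realized by routing h2 into
    # extra inhibitor production, i.e. raising the effective threshold.
    return sequestration_unit(h1, 0.25 + 3.0 * h2)
```

On Boolean inputs this reproduces the XOR pattern: `xor_net(1, 0)` and `xor_net(0, 1)` settle well above zero, while `xor_net(0, 0)` and `xor_net(1, 1)` stay near zero.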

Why it matters

Unlike traditional logic-based genetic circuits, BNNs support analog computation, which is more efficient for biological systems with graded molecular concentrations. This enables new forms of cellular decision-making, potentially useful for diagnostics, therapeutics, or synthetic learning systems.