Architecture

SYNTHOS processes every input through a seven-layer cognitive pipeline (L0–L6). Each layer is a standalone Python module with its own regex geometry, data structures, and visualization.

Pipeline Overview

USER INPUT
    │
    ├─► L0  LPE   Lexical Primitive Engine      48+ regex primitives
    │              ·  ───  ○  ∞  ⑂  ⌈⌋
    ├─► L1  GPL   Geometric Parse Lattice       4×4 cell grid
    │              ┌──┐→┌──┐→┌──┐→┌──┐
    │              └──┘  └──┘  └──┘  └──┘
    ├─► L2  SCM   Semantic Construct Manifold    named captures → concepts
    │              (?P<genus>\w+)  →  ⊂  ≡  ↔
    ├─► L3  TAM   Topological Attention Mesh     8-head pattern intersection
    │              Q∩K = attention weight
    ├─► L4  RGE   Recursive Grammar Engine       EBNF → recursive regex
    │              STATEMENT → DIRECTIVE SUBJECT PREDICATE
    ├─► L5  SCF   State Crystallization Field    rank-3 tensor + memory
    │              [LEXICAL][SEMANTIC][TOPOLOGICAL]
    └─► L6  OPS   Output Projection Surface      substitution chains
                   s/EMIT RESPONSE/[OUTPUT]/g
                                │
                                ▼
                         STRUCTURED OUTPUT

Layer Details

L0

Lexical Primitive Engine (LPE)

synthos.layers.lpe

The atomic alphabet of cognition. Every regex construct maps to a named primitive with an ASCII geometric form.

from synthos.layers.lpe import PrimitiveRegistry

registry = PrimitiveRegistry()
p = registry.get_primitive("P001")  # GLYPH
print(p.regex)           # .
print(p.geometric_form)  # ·
print(p.description)     # Match any single character
  • 48+ primitives across 6 types: ATOMIC, QUANTIFIER, GROUP, LOOKAROUND, REFERENCE, CHARACTER_CLASS
  • Each primitive has: id, name, regex, geometric_form, description, examples, composition_rules
  • search_primitives() for fuzzy lookup, validate_regex_syntax() for pattern checking
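
The lookup helpers above can be sketched in plain Python. This is an illustrative stand-in, not the library's implementation; the registry entries and the `TETHER` primitive name are hypothetical:

```python
import re

def validate_regex_syntax(pattern: str) -> bool:
    """Return True if `pattern` compiles as a valid regex."""
    try:
        re.compile(pattern)
        return True
    except re.error:
        return False

def search_primitives(registry: dict, query: str) -> list:
    """Fuzzy lookup: primitive ids whose name or description contains `query`."""
    q = query.lower()
    return [pid for pid, p in registry.items()
            if q in p["name"].lower() or q in p["description"].lower()]

# Illustrative two-entry registry (the real one ships 48+ primitives)
registry = {
    "P001": {"name": "GLYPH", "regex": ".", "description": "Match any single character"},
    "P002": {"name": "TETHER", "regex": r"\1", "description": "Backreference to group 1"},
}

print(validate_regex_syntax(r"\w+"))        # True
print(validate_regex_syntax("(unclosed"))   # False
print(search_primitives(registry, "glyph")) # ['P001']
```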

L1

Geometric Parse Lattice (GPL)

synthos.layers.gpl

A 4×4 grid of regex parse cells with directional transitions. Input is routed spatially through the lattice.

from synthos.layers.gpl import GeometricParseLattice

lattice = GeometricParseLattice()
result = lattice.traverse_lattice("  variable=value  ")
print(result["path_taken"])  # ['AA00', 'AB00', 'AC00', 'AD00']
print(lattice.visualize_lattice())
  • 16 cells in a 4×4 grid, each with: id, pattern, type (MATCH/ROUTE/STORE/EMIT), edges, weight
  • Traversal follows edge connections; each cell's regex is tested against remaining input
  • Adjacency matrix tracks connectivity; statistics include connected components
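
Traversal can be sketched as a chain of cells that each strip what they match and hand the remainder onward. This dict-based, one-row slice is only illustrative; the cell ids mirror the example above, but the structure is not the library's:

```python
import re

# One illustrative row of the lattice: pattern per cell, single outgoing edge
cells = {
    "AA00": {"pattern": r"\s*", "next": "AB00"},  # ROUTE: skip leading space
    "AB00": {"pattern": r"\w+", "next": "AC00"},  # MATCH: identifier
    "AC00": {"pattern": r"=",   "next": "AD00"},  # MATCH: assignment
    "AD00": {"pattern": r"\w+", "next": None},    # STORE: value
}

def traverse(cells, start, text):
    """Follow edges, testing each cell's regex against the remaining input."""
    path, pos, cell_id = [], 0, start
    while cell_id is not None:
        m = re.match(cells[cell_id]["pattern"], text[pos:])
        if m is None:
            break
        path.append(cell_id)
        pos += m.end()
        cell_id = cells[cell_id]["next"]
    return {"path_taken": path, "consumed": text[:pos]}

result = traverse(cells, "AA00", "  variable=value  ")
print(result["path_taken"])  # ['AA00', 'AB00', 'AC00', 'AD00']
```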

L2

Semantic Construct Manifold (SCM)

synthos.layers.scm

Named captures become concept nodes. Seven relation types (→ ⊂ ≡ ⊃ ↔ ⊕ ⊗) weave a semantic graph.

from synthos.layers.scm import SemanticConstructManifold

scm = SemanticConstructManifold()
concepts = scm.extract_concepts_from_text("NeuralNetwork ⊂ DeepLearning")
relations = scm.extract_relations_from_text("NeuralNetwork ⊂ DeepLearning")
scm.activate_concept("NeuralNetwork", 1.0)
  • ConceptNode: concept_id, genus, differentia, abstraction_level, activation
  • RelationEdge: source, target, relation_type, weight
  • Context windows via lookahead; activation propagation across the graph (NetworkX)
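
One step of activation propagation can be illustrated without NetworkX. The edge weights and decay factor below are invented for the sketch; the library's actual graph machinery differs:

```python
# Illustrative weighted adjacency: source -> [(target, weight), ...]
edges = {
    "NeuralNetwork": [("DeepLearning", 0.9)],   # NeuralNetwork ⊂ DeepLearning
    "DeepLearning":  [("MachineLearning", 0.8)],
}

def propagate(edges, source, activation, decay=0.5):
    """Spread activation outward, attenuated by edge weight and a decay factor."""
    levels = {source: activation}
    frontier = [source]
    while frontier:
        node = frontier.pop()
        for target, weight in edges.get(node, []):
            spread = levels[node] * weight * decay
            if spread > levels.get(target, 0.0):
                levels[target] = spread
                frontier.append(target)
    return levels

print(propagate(edges, "NeuralNetwork", 1.0))
```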

L3

Topological Attention Mesh (TAM)

synthos.layers.tam

8 parallel attention heads where relevance = pattern intersection area.

from synthos.layers.tam import TopologicalAttentionMesh

tam = TopologicalAttentionMesh()
result = tam.multi_head_attention("Neural networks process data")
print(result["heads_used"])
print(result["concatenated_output"])
  • AttentionHead: query_pattern, key_pattern, value_pattern, scope (LOCAL/WINDOW/GLOBAL)
  • Intersection types: FULL_OVERLAP, PARTIAL_OVERLAP, CONTAINMENT, DISJOINT, SEQUENTIAL
  • Weights computed from intersection area / union area (Jaccard-like)
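
The Jaccard-like weighting can be sketched over the character positions each pattern covers. A minimal illustration, not the mesh's actual code:

```python
import re

def match_positions(pattern, text):
    """Set of character offsets covered by all matches of `pattern` in `text`."""
    covered = set()
    for m in re.finditer(pattern, text):
        covered.update(range(m.start(), m.end()))
    return covered

def attention_weight(query_pattern, key_pattern, text):
    """Jaccard-like weight: |Q ∩ K| / |Q ∪ K| over covered positions."""
    q = match_positions(query_pattern, text)
    k = match_positions(key_pattern, text)
    union = q | k
    return len(q & k) / len(union) if union else 0.0

text = "Neural networks process data"
# Q covers every word character; K covers only the capitalized "Neural"
print(attention_weight(r"\w+", r"\bN\w+", text))
```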

L4

Recursive Grammar Engine (RGE)

synthos.layers.rge

EBNF productions translated to recursive regex with FSM parsing.

from synthos.layers.rge import RecursiveGrammarEngine

rge = RecursiveGrammarEngine()
tree = rge.parse("DEFINE model AS transformer", "STATEMENT")
if tree:
    tree.print_tree()
print(rge.validate_grammar())
  • Built-in rules: SYNTHOS_ROOT, STATEMENT, EXPRESSION, GEOMETRY, LITERAL, IDENTIFIER, NUMBER
  • Parse trees constructed from named capture groups
  • Grammar validation checks for undefined references and unreachable rules
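
The STATEMENT production can be illustrated by flattening it into one regex with named captures and reading the parse "tree" off `groupdict()`. The directive alternation and field names here are hypothetical, not the engine's own rule set:

```python
import re

# STATEMENT → DIRECTIVE SUBJECT PREDICATE, flattened into named captures
STATEMENT = re.compile(
    r"(?P<directive>DEFINE|EMIT|ROUTE)\s+"
    r"(?P<subject>\w+)\s+AS\s+"
    r"(?P<predicate>\w+)$"
)

def parse_statement(text):
    """Return a nested-dict parse tree built from the named capture groups."""
    m = STATEMENT.match(text)
    if m is None:
        return None
    return {"rule": "STATEMENT", "children": m.groupdict()}

print(parse_statement("DEFINE model AS transformer"))
# {'rule': 'STATEMENT', 'children': {'directive': 'DEFINE',
#  'subject': 'model', 'predicate': 'transformer'}}
```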

L5

State Crystallization Field (SCF)

synthos.layers.scf

Rank-3 symbolic tensor with 3 layers (lexical, semantic, topological) plus a memory lattice.

from synthos.layers.scf import StateCrystallizationField, LayerType
import re

scf = StateCrystallizationField()
m = re.match(r"\w+", "pattern")
scf.crystallize_match(r"\w+", m, LayerType.LEXICAL)
print(scf.compute_coherence())
print(scf.visualize_tensor())
  • State tensor: 3 layers × N rows × N cols, each cell holds a CrystallizedState
  • Memory lattice: short-term (sliding window 8), long-term (named registers), episodic (stack)
  • Coherence rules via backreference patterns; gate conditions control output emission
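
The three memory stores can be sketched with standard containers. The class and method names below are illustrative, not the module's API:

```python
from collections import deque

class MemoryLattice:
    """Sketch of the memory lattice: sliding window, named registers, episode stack."""

    def __init__(self, window=8):
        self.short_term = deque(maxlen=window)  # sliding window of recent states
        self.long_term = {}                     # named registers
        self.episodic = []                      # stack of episodes

    def observe(self, state):
        self.short_term.append(state)           # oldest entry falls off at capacity

    def store(self, register, state):
        self.long_term[register] = state

    def push_episode(self, episode):
        self.episodic.append(episode)

mem = MemoryLattice()
for i in range(10):
    mem.observe(f"state-{i}")
print(list(mem.short_term))  # only the last 8 states survive the window
```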

L6

Output Projection Surface (OPS)

synthos.layers.ops

Substitution chains transform crystallized state into structured ASCII output.

from synthos.layers.ops import OutputProjectionSurface, TemplateType

ops = OutputProjectionSurface()
output = ops.render_template(TemplateType.PARAGRAPH, {
    "TOPIC": "Attention",
    "BODY": "Pattern intersection geometry"
})
print(output)
  • SubstitutionChain: ordered list of find→replace regex rules applied sequentially
  • 5 template types: PARAGRAPH, LIST, CODE, TREE, DIAGRAM
  • Output routing maps intent keywords to template types
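
A substitution chain reduces to an ordered list of find→replace rules applied in sequence. A minimal sketch; the first rule echoes the example from the pipeline diagram, the rest are invented for illustration:

```python
import re

# Ordered (find, replace) rules, applied one after another
chain = [
    (r"EMIT RESPONSE", "[OUTPUT]"),  # s/EMIT RESPONSE/[OUTPUT]/g
    (r"\s+", " "),                   # normalize runs of whitespace
    (r"^\s+|\s+$", ""),              # trim leading/trailing space
]

def apply_chain(chain, text):
    """Apply each rule in order; later rules see earlier rules' output."""
    for pattern, replacement in chain:
        text = re.sub(pattern, replacement, text)
    return text

print(apply_chain(chain, "  EMIT RESPONSE   now  "))  # '[OUTPUT] now'
```

Because the rules run sequentially, ordering matters: trimming before normalizing whitespace would leave the interior runs untouched.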

Design Principles

  • No weights, no training. All intelligence is structural — encoded in pattern topology.
  • Fully inspectable. Every operation is readable text. No opaque tensors.
  • Modular. Each layer can be used independently or composed into the full pipeline.
  • Extensible. Add custom primitives, lattice cells, grammar rules, attention heads, or templates.
  • Deterministic. Same input → same output (no sampling, no temperature).