Alexios Bluff Mara LLC · dba Red Team Kitchen · Chicago, IL · BloNo research node
Brain-response analysis. Personal AI. Built where Cardinal meets Code.
Alexios Bluff Mara LLC is a research-and-software lab
operating in collaboration with Illinois State University. Two flagship
projects, both running live right now: Cortex, a multimodal
brain-response analysis system on Meta's TRIBE v2 + Google's Gemma 4, and
Mercury, a personal AI agent reachable from terminal,
Discord bot, and a phone-friendly web dashboard.
Alexios Bluff Mara LLC partners with the
ISU College of Applied Science & Technology
on applied AI and computational-neuroscience research only
(not coursework). The work is hardware-flexible by design: the same code path
runs on a single RTX 5090 in our Chicago lab (call sign
Ascended Base), on the M4 Max MacBook (Big Apple) sitting next to it, and —
through ACCESS-CI partnerships at UIUC and similar institutions — on national
HPC nodes when the experiment demands them. Open-source where possible.
Local-first by default.
Active named collaborators at ISU CAST:
Mangolika Bhattacharya, Sally Xie, and
Rosangela Follmann (early-stage, one meeting). Full bios on
/about.
"From a single Blackwell card to an HPC partition — same code, different
horsepower. The research question dictates the hardware, not the other
way around."
Below are the ISU departments and facilities the lab interacts with.
None of these endorse the lab's work; they are listed because they are
the academic homes the research draws from and contributes to.
Pulled from the live project READMEs. No extrapolation, no cloud bills
rounded up.
$22.04 · Kimi K2.6 sprint cost
75 min · Sprint duration · 14 commits
20,484 · Cortical vertices
6 · Mercury surfaces
194 · Tok/sec on RTX 5090
~6 min · Per-scan turnaround
Work with us
Alexios Bluff Mara LLC is actively seeking research partners, compute
sponsors, and early investors aligned with local-first AI infrastructure
and computational neuroscience.
We hold an active research association with Illinois State University.
Faculty co-PIs in neuroscience or CS can unlock NAIRR Start-Up
GPU allocations (A100/H100 at NCSA/UIUC) within ~3 weeks of
submission. If you are an ISU faculty member interested in fMRI
prediction or personal AI, reach out.
Google Cloud / VC credits
Applying via 1871 Chicago accelerator network.
The Google for Startups AI Tier offers up to $350K in GCP credits
for startups using Gemini/Vertex as core infrastructure, accessed
through partner accelerators. ABM LLC is a Chicago-based AI
research startup. If your accelerator or VC fund has a Google
partnership, let's talk.
AI research startup seeking $250K–$500K pre-seed for SaaS scan access, an HPC marketplace, and ISU research partnerships. Email soumit@redteamkitchen.com with the subject line INVEST.
Google for Startups AI Tier via 1871 Chicago
Up to $350K in GCP credits for Google-integrated AI startups through the 1871 Chicago accelerator network. Apply via cloud.google.com/startup/ai.
NAIRR compute grant (ISU faculty co-PI)
National AI Research Resource: A100/H100 GPU time at NCSA/UIUC, ~3-week turnaround. Requires ISU faculty co-PI. Contact soumit@redteamkitchen.com.
What Mercury says when you ask it who it is
The output below is from a real Mercury terminal session running on Seratonin
(the local 5090). Same agent answers identically through the Discord bot
(@abmsnowy) and the phone-friendly web dashboard.
$ mercury chat
[mercury] connected to local Ollama (gemma4:e4b @ 194 tok/s)
[mercury] session resumed · 1,247 prior messages in context
› who are you?
Identity & Architecture
Agent: Snowy The Bot / ABM Hermes
Developer: Alexios Bluff Mara LLC / Red Team Kitchen
in collaboration with Illinois State University 🔴
Execution Environment: Local RTX 5090 via Ollama (direct, low-latency,
high-power access to GPU resources)
Model: Gemma 4 E4B (194 tok/s)
Cost / Limits: No cloud API costs, no rate limits
Persistence: Persistent memory across sessions
Software Capabilities (Skills & Tools)
Web/Browser: web_search, web_extract, browser suite (Navigation, Vision, Interaction)
Code/Dev: terminal, read_file, search_files, patch, write_file, github,
systematic-debugging, test-driven-development
Data/MLOps: data-science, mlops (TRL/Unsloth fine-tuning, vLLM inference,
SAM segmentation), research (arXiv, literature review)
Memory/Workflow: memory (user/general facts), skills_list, session_search,
delegate_task for orchestrating multi-step workflows
Conclusion
Locally hosted, powerful, state-of-the-art AI agent designed for
uninterrupted, resource-intensive, multi-domain work. Primary vulnerability
surface: inability to run standard system diagnostic commands in the
immediate shell environment. All operational parameters (GPU, memory, model
throughput) are confirmed internally.
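Under the hood, a local Ollama server exposes an HTTP chat API on port 11434. A minimal sketch of the kind of request an agent like Mercury would issue — the model tag `gemma4:e4b` comes from the transcript above, but the wrapper function and whether Mercury calls this endpoint directly are our assumptions:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_payload(model: str, prompt: str) -> dict:
    """Assemble a non-streaming chat request for a local Ollama server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask(prompt: str, model: str = "gemma4:e4b") -> str:
    """Send one chat turn and return the model's reply text."""
    payload = build_chat_payload(model, prompt)
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


# ask("who are you?")  # requires a running Ollama server with the model pulled
```

Because everything stays on localhost, there are no per-token API charges — which is what the "no cloud API costs, no rate limits" line in the transcript is claiming.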
What we have, and what we'd build next with funding
The next surface for both Cortex and Mercury is spatial — head-mounted
displays where a brain-response visualization actually wraps around the
viewer instead of being trapped in a browser tab. Four platforms matter
here, and we already own two of them.
Hardware on hand · today
Meta Quest 3
$499 standalone XR headset, 4K+ resolution per eye (2064×2208), Snapdragon
XR2 Gen 2, full-color passthrough. We have one in the lab. Meta's
Quest for Business program provides device-management seats, and the
Meta Horizon Developer and Meta for Developers tracks publish quarterly
grant calls and provide SDK access.
Rendering Cortex's 20,484-vertex brain mesh in WebXR on Quest 3 is
the cheapest credible spatial demo on the public market.
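A rough footprint estimate suggests why that mesh is trivial for mobile XR silicon. This is a sketch: the per-vertex layout (positions plus one activation scalar, typical for an fsaverage5-style cortical surface) is our assumption, and triangle indices are omitted:

```python
VERTICES = 20_484   # fsaverage5-style cortical surface, both hemispheres
FLOAT32 = 4         # bytes per float32 component

positions = VERTICES * 3 * FLOAT32    # x, y, z per vertex
activations = VERTICES * 1 * FLOAT32  # one response scalar per vertex
total_bytes = positions + activations

print(f"{total_bytes / 1024:.0f} KiB of vertex data per frame")  # ~320 KiB
```

A few hundred KiB of vertex data per frame is far below what a Quest 3-class GPU streams for an ordinary WebXR scene, which is the basis for the "cheapest credible spatial demo" claim.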
Hardware on hand · today
Apple Vision Pro
$3,499 spatial computer with dual 4K micro-OLED, M2 + R1 chips, true 6DoF
eye + hand tracking. We have one. The Apple Developer Program ($99/yr)
ships the visionOS SDK + RealityKit. Apple Entrepreneur Camp and Apple
Investor in People application tracks open periodically; we plan to ship
the AVP demo first because it carries the most brand credibility per demo
of the three platforms.
Wishlist · with funding
Google Android XR (Samsung Galaxy XR)
Samsung's Project Moohan / Galaxy XR headset, expected at ~$1,800
retail and built on Snapdragon XR2+ Gen 2. Runs Android XR
with Gemini and Google's full ecosystem. It offers the best price-per-pixel
of the three platforms, ideal for an education-sector deployment:
undergrad neuro / cognitive-science labs at ISU + UIUC could buy ten of
these for the cost of one Vision Pro. Google's Android XR
Developer Program is in beta; Google for Startups
and Google Cloud for Startups both offer up to $200K
in cloud credits to qualifying companies.
Wishlist · with funding · long shot
Google AR Glasses (XR-lite, when it ships)
Google's AR glasses (announced 2026, partner-distributed via
Warby Parker / Gentle Monster / Samsung) are the form factor that actually
gets a brain-scan demo into a classroom rather than a lab. Pricing TBD,
likely $1,200-$2,000. We're in the queue for developer-program access; in
the meantime the same WebXR build that ships to Quest will gracefully
degrade to glasses-class hardware.
Cost-to-ship math: Quest 3 + AVP combined puts the lab at
~$4,000 of XR hardware on hand. Adding Galaxy XR + Google Glasses would be
another ~$3,000-$3,800. Total spatial-computing capex to cover all
four major platforms: under $8,000. That's the whole BOM for an
education-sector spatial-neuroscience pilot. Compare to the cost of one
hospital MRI hour ($600-$1,200) and the conversation gets short.
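The capex arithmetic above can be checked line by line; prices are as quoted in this section, and the variable names are ours:

```python
# Hardware on hand today.
quest3, avp = 499, 3_499
# Wishlist, with funding.
galaxy_xr = 1_800                       # expected retail
glasses_lo, glasses_hi = 1_200, 2_000   # pricing TBD, quoted range

on_hand = quest3 + avp                  # ~$4,000 of XR hardware on hand
wishlist_lo = galaxy_xr + glasses_lo    # low end of the ~$3,000-$3,800 range
wishlist_hi = galaxy_xr + glasses_hi    # high end
total_hi = on_hand + wishlist_hi        # worst-case four-platform total

print(on_hand, wishlist_lo, wishlist_hi, total_hi)  # 3998 3000 3800 7798
assert total_hi < 8_000  # "under $8,000" holds even at the high end
```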
Research conducted in association with Illinois State University
(research collaboration) · Bloomington–Normal, IL · ABM LLC in Chicago, IL.