Project Archive

PromptBox | Prompt management and testing toolkit | 2025 | Public | ux design, prompt engineering, next.js |

Repo Link

PromptBox is a structured environment for managing and testing prompts. It allows users to organize a library of prompts, track versions, run side-by-side comparisons, and chain together multi-step processes. With analytics built in, it helps refine prompt effectiveness over time. PromptBox laid the foundation for PromptStudio by showing the value of structured, reusable workflows in AI development, turning prompt engineering from improvisation into a disciplined practice.
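As a minimal sketch of the kind of structure a prompt library like this manages, here is a versioned prompt record in Python. The class and field names are hypothetical illustrations, not PromptBox's actual data model.

```python
# Hypothetical sketch of a versioned prompt record: each prompt keeps an
# ordered history of its text, so older versions can be compared side by side.
from dataclasses import dataclass, field

@dataclass
class Prompt:
    name: str
    versions: list[str] = field(default_factory=list)

    def add_version(self, text: str) -> int:
        """Append a new version and return its index in the history."""
        self.versions.append(text)
        return len(self.versions) - 1

    def latest(self) -> str:
        """Return the most recent version of the prompt text."""
        return self.versions[-1]

p = Prompt("summarize")
p.add_version("Summarize the text in one sentence.")
p.add_version("Summarize the text in one sentence, in plain language.")
print(p.latest())
```

Keeping every version in an append-only list is what makes side-by-side comparison and rollback cheap, which is the core of the workflow described above.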


LocalMac LLM | Tiny transformer trained on Apple Silicon | 2025 | Public | AI/LLMs, machine learning, transformers, python, apple mlx |

Repo Link

LocalMac LLM is a small-scale machine learning project designed to make the process of training a language model fully transparent and approachable. I built it to learn how transformer architectures work from the ground up. It demonstrates every stage of the pipeline, from tokenizer building to model training and text generation, all running on an Apple Silicon Mac. The project uses a ~1.5M parameter GPT-style architecture and is tuned for educational clarity, making it an ideal resource for anyone curious about how LLMs work at the most fundamental level.
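To give a sense of where a "~1.5M parameter" figure comes from, the sketch below counts the weight matrices of a tiny GPT-style model. The hyperparameters are illustrative assumptions, not LocalMac LLM's actual configuration; biases and layer norms are omitted since they contribute comparatively few parameters.

```python
# Rough parameter count for a tiny GPT-style transformer.
# All hyperparameters are assumed for illustration only.
def gpt_params(vocab_size: int, d_model: int, n_layers: int,
               d_ff: int, context: int = 256) -> int:
    embed = vocab_size * d_model      # token embedding (tied with output head)
    pos = context * d_model           # learned positional embedding
    attn = 4 * d_model * d_model      # Q, K, V, and output projections
    ff = 2 * d_model * d_ff           # the two feed-forward matrices
    return embed + pos + n_layers * (attn + ff)

total = gpt_params(vocab_size=4096, d_model=128, n_layers=4, d_ff=512)
print(f"{total:,} parameters")  # lands on the order of 1.5M
```

The takeaway is that at this scale the token embedding and the per-layer feed-forward blocks dominate the count, which is why even a 4-layer model with a small vocabulary reaches the low millions.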


MemeMap | AI-tagged maps for humor, facts and travel | 2025 | Public | react, typescript, gemini ai api, ui/ux design, leaflet.js |

Repo Link

MemeMap is an interactive map that generates playful or informative place tags based on the user’s chosen topic and style. With modes ranging from humorous to factual to travel-oriented, it reimagines maps as a space for cultural commentary and creative exploration. The app demonstrates how light AI integration with a clean React interface can turn a familiar tool into something expressive, interactive, and fun. Inspired by Hoodmaps.


Simple MPC | Simple music-making workflow exploration | 2025 | Public | html/css/js, ui/ux design, interaction design |

