I have assembled tools like free web search with LangSearch instead of SearXNG to avoid rate limits, a very thorough literature search and download tool, and some prototypes with examples for my scientific tools, which help with SEM, XRD, and PL and handle messy data in OriginLab. Opus and I (mostly Opus) have made some MCP tools for OPJ files (I hate these) covering batch fitting, peak decomposition, waterfall plots, and plot export directly into the chat, which is really flashy, especially for the boomer scientists.

I try to run most things locally: Docling; embedding, reranking, and VLM models; my file server; Jupyter with GPU acceleration (on ROCm, not the NVIDIA setups you mostly come across); MetaMCP with all of my tools; and so on. The stack is split into three profiles (Lab, Coder, and Instruments), and each one has its own skill files and system prompts so that we get better efficiency and performance. For my lab to access it, I use Cloudflare, so it is somewhat secure (I hope).

I will try to better document my path toward doing something token-efficient. I have already found good success separating my tools into packets with subtools and individual skill.md files.

I really want to thank the OWUI community for the great lessons and tools/functions. I will try to port all of my tools to the OWUI community for download. This post is intended to start a conversation with other people interested in implementing tools in scientific workloads. You can take a look at the link (sorry if the GitHub looks messy, I'm still in the process of setting it up).
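For anyone curious what I mean by packets, here is a rough sketch of the layout idea (all names here are illustrative, not my actual repo): each packet groups related subtools and carries its own skill.md telling the model when and how to use that packet, so only the relevant instructions get loaded per profile.

```
tools/
├── originlab/            # OPJ handling packet
│   ├── skill.md          # usage guidance the model reads for this packet
│   ├── batch_fit.py
│   ├── peak_decompose.py
│   └── waterfall_plot.py
├── search/               # web + literature search packet
│   ├── skill.md
│   ├── langsearch.py
│   └── literature.py
└── instruments/          # SEM / XRD / PL helpers
    ├── skill.md
    └── measurement_io.py
```

Splitting the skill files per packet (rather than one giant system prompt) is where most of the token savings came from for me.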