Opsmas 2025 Day 10: lematt & pai
lematt
I originally wrote lematt in 2017 to manage Let’s Encrypt cert libraries more “professionally” than just letting random web servers or proxies self-create certs with no tracking, logging, or backups.
basically, if you aren’t using a 100% managed service like AWS load balancers with self-provisioning certs, you shouldn’t let your random infrastructure components just create untracked keys and certs you can’t recover on demand. but what do i know? the entire tech world is run by 24-year-old billionaire boys these days apparently, so it’s not like experience in infrastructure or performance or security or platform management matters anymore.
nevertheless, i persist. This year I updated the lematt platform with more modern code organization, a few more built-in API operations instead of shelling out to openssl commands directly, and better config file management.
It can do things like:
- register certs for remote hosts through port forwarding, get the certs locally, then sync them back to remote hosts
- after every cert update, automatically run a post-create command (like backing up your key+cert library, because Let’s Encrypt doesn’t give you unlimited cert requests per domain over short time periods! if there’s a bug in your systems you can get locked out of “creating a certificate” for days at a time if you don’t have backups)
- easily manage single certs or SNI certs with different config definitions
- create certs with “safe concurrency,” so if you have 100 hosts it can generate certs for all of them in about 10 seconds instead of 90 seconds serially
- auto-generate systemd timers and notification events for auto-running renewals
- can run daily to just renew certs when they are within N days of expiring
- investigate current hosts for days-to-expiration, etc.
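the “safe concurrency” point above is just bounded parallelism: run many cert requests at once, but cap how many are in flight so you don’t hammer the ACME endpoint. a minimal Python sketch of the pattern (not lematt’s actual code; `provision_cert` is a hypothetical stand-in for the real issuance call):

```python
import concurrent.futures
import time

def provision_cert(host):
    # stand-in for the real ACME issuance work (hypothetical)
    time.sleep(0.01)
    return f"{host}: issued"

hosts = [f"web{i}.example.com" for i in range(100)]

# cap in-flight requests: 100 hosts drain in ~10 batches of 10
# instead of one at a time, without flooding the cert authority
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(provision_cert, hosts))
```

the `max_workers` cap is the “safe” part: full fan-out against a rate-limited CA is a good way to trip the same lockouts the backups are guarding against.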
pai
After the ChatGPT API was released in early 2023, I wrote gptwink, a tiny interactive terminal client that automatically logged and managed chats, but it never had enough features and needed a more stable foundation to improve.
So this year I made pai (polyglot-ai) as a new interactive terminal client with even more features and goals:
- automatically logs all chats and turn interactions to multiple formats
- easy model switching
- easy provider switching (API keys and endpoint URLs)
- model concurrency features
- generate multiple prompts for running against one model concurrently
- run an “arena” where you provide a single input prompt then get replies from N configured models each configured with their own system prompts (so you can have models “argue” or “debate” each other from different assumptions up front)
- massive debug features
- live streaming event printing and debugging
- tool call debugging
- full API history viewing
- switching models within a single chat
- saving/recalling prompt templates/libraries
- tool usage both “real” and synthetic
- splits out thinking/reply/tool call/tool call reply/answer (as best as possible)
- a “smooth streaming” mode that adaptively buffers incoming data to display tokens at a constant rate, so a provider’s “fast burst, pause/delay, another burst” pattern becomes a consistent, predictable stream instead of randomly latent output
- easily switch between “text completion” (raw model) and “chat completion” (user/assistant role switching with chat template application mode)
- apply custom chat format locally then send to the “text completion” endpoint instead of using a provider’s “chat completion” endpoint
- and lots more!
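the “smooth streaming” idea above can be sketched as a small pacing buffer: bursts go into a queue, and tokens come out at a delay that adapts to the backlog. this is a hypothetical illustration of the technique, not pai’s implementation; every name here is made up:

```python
import collections
import time

class SmoothStreamer:
    """Re-emit bursty token arrivals at a steady display rate.

    Sketch only: the delay between emitted tokens shrinks as the
    backlog grows and stretches (up to max_delay) when the buffer
    runs low, so the reader sees a roughly constant stream.
    """
    def __init__(self, min_delay=0.005, max_delay=0.08):
        self.buf = collections.deque()
        self.min_delay = min_delay
        self.max_delay = max_delay

    def feed(self, tokens):
        # called whenever a provider burst arrives
        self.buf.extend(tokens)

    def next_delay(self):
        # bigger backlog -> drain faster; near-empty -> slow down
        backlog = len(self.buf)
        if backlog == 0:
            return self.max_delay
        return max(self.min_delay, min(self.max_delay, 1.0 / backlog))

    def drain(self, write=print):
        # emit buffered tokens one at a time at the paced rate
        while self.buf:
            write(self.buf.popleft())
            time.sleep(self.next_delay())
```

the design tradeoff is latency vs. smoothness: a bigger `max_delay` hides longer provider stalls but means the display lags further behind the actual stream.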
examples below.
the terminal interface changed over the year: originally I didn’t include the model name in the output, so some of the earlier screenshots are vague about which model is actually being consulted (it’s still all in the logs somewhere; if anybody cares I can look up which models produced which outputs below).
evidence
we haven’t done a screenshot gallery in a while. below is a mix of general AI screenshots from the year and specific notable pai sample sessions, all mixed together. enjoy.