My Stand-Off with LM Studio’s HTTP Server Configuration File
2025-08-17 : Today I had the opportunity to set up Ollama & LM Studio on an Ubuntu 24.04 Desktop machine. Ubuntu had been installed the night before (2025-08-16).
LM Studio : 0.3.23 (Build 3)
The LM Studio docs show that the LM Studio HTTP daemon is started with lms server start. The first few times I issued this command, I got an error that read something like …
Error: ENOENT: no such file or directory, open '/home/<user>/.lmstudio/.internal/http-server-config.json'
After some Googling, I discovered that ENOENT stands for Error NO ENTry, and that it is a standard POSIX error code that other systems also throw.
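As a side note, this same error code surfaces in many languages, not just in Node.js-style tooling. A minimal Python sketch showing where ENOENT comes from (using only the standard errno module):

```python
import errno
import os

# ENOENT is the POSIX "No such file or directory" error number.
print(errno.ENOENT, "->", os.strerror(errno.ENOENT))

# Opening a path that does not exist raises FileNotFoundError,
# whose errno attribute is exactly ENOENT.
try:
    open("/path/that/does/not/exist")
except FileNotFoundError as exc:
    assert exc.errno == errno.ENOENT
```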
The ENOENT error seems to indicate a missing configuration file. The error kept appearing after repeated attempts, and checking ~/.lmstudio/.internal confirmed that http-server-config.json did not exist.
I then discovered that specifying a port number allowed me to start the service successfully …
lms server start --port 3000
Although the service started, when I checked the location again, the configuration file was still absent.
I reasoned that when http-server-config.json is missing, the only parameter required to start the server is the port number. Surely, though, there must be other configuration parameters. Only after some digging did I find a sample http-server-config.json file with other parameters.
I eventually shut down the Ubuntu machine and brought the unit back home. After a shower, freshened up, I was eager to check on the issue. When I started the machine again and tried lms server start (without the --port 3000 parameter), to my surprise it started. So I checked for the presence of the file, and lo and behold, there it was, created just a few seconds prior.
These are the contents of my http-server-config.json file …
"autoStartOnLaunch": false,
"port": 3000,
"cors": false,
"logSensitiveData": true,
"logIncomingTokens": false,
"verbose": false,
"logLinesLimit": 500,
"networkInterface": "127.0.0.1",
"justInTimeModelLoading": true,
"fileLoggingMode": "succinct"
}
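As a quick sanity check, the file can be parsed with a few lines of standard-library Python. This is just a sketch: the path is the one observed above, and the sample JSON is reproduced inline so the snippet also works on a machine where the file has not yet been created.

```python
import json
from pathlib import Path

# Location where LM Studio eventually created the file on my machine.
CONFIG_PATH = Path.home() / ".lmstudio" / ".internal" / "http-server-config.json"

# The contents shown above, reproduced so the snippet is self-contained.
SAMPLE = """{
  "autoStartOnLaunch": false,
  "port": 3000,
  "cors": false,
  "logSensitiveData": true,
  "logIncomingTokens": false,
  "verbose": false,
  "logLinesLimit": 500,
  "networkInterface": "127.0.0.1",
  "justInTimeModelLoading": true,
  "fileLoggingMode": "succinct"
}"""

# Prefer the real file when present; otherwise fall back to the sample.
raw = CONFIG_PATH.read_text() if CONFIG_PATH.exists() else SAMPLE
config = json.loads(raw)
print(f"LM Studio HTTP server listens on "
      f"{config['networkInterface']}:{config['port']}")
```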
A few steps along my journey to Eternity. I’ll never know what’s
around the corner. Let me embrace serendipity and be thankful that I can
experience the events that life presents.
References
https://lmstudio.ai/docs/cli
https://www.tecmint.com/lm-studio-run-llms-linux/
https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues/415
https://stackoverflow.com/questions/19902828/why-does-enoent-mean-no-such-file-or-directory
https://stackoverflow.com/questions/43260643/how-to-resolve-node-js-error-enoent-no-such-file-or-directory