A quick and dirty REST API for noobs

Recently, I found myself in yet another problem of my own making... This one required me to query an external data source. In this case, it was JSON data, but it could have easily been any other datatype. Since I was testing an application that I'm currently in the middle of developing, and hadn't yet nailed down what the structure of the actual data would look like in the end, I needed to figure out how to set up some kind of server to actually serve my (test) data.

I need what now?

Given that I use Docker for most everything these days, I went out in search of a way to get this data up somewhere and actually send new data to it. After scrounging around on Github, Stack Overflow, and eventually ending up on page 23-something of Google, all signs pointed to setting up some kind of RESTful API server. More googling led me to almost giving up, since most solutions were either hosted ($$$), had a learning curve steep enough that they weren't exactly "quick", or were overly complicated to set up. Then I ran across json-server.

The description claimed "Get a full fake REST API with zero coding in less than 30 seconds (seriously)", but we all know how that goes. I looked through a few of the linked resources on the landing page, but wasn't sold. That's when I ran across the post over at CodingtheSmartWay, which stepped through the process of doing everything locally with npm. I'm a sucker for a good walkthrough, but since it was a few years old, I still wasn't all in.

Besides, I wanted to run this in Docker, right? No sense in doing a bunch of CLI steps when the whole thing could be reproduced with a docker container, right? I already have Portainer, Traefik, and everything else set up on that particular test server, so further hunting led me to docker-json-server, which was more of what I was looking for in the first place.

Getting things moving

Taking into account that every setup is different, I took the initial (basic) docker commands outlined there and pulled together my docker-compose.yml file:

version: '3.3'
networks:
  proxy:
    external: true
services:
  json-server:
    container_name: json-server
    image: clue/json-server
    restart: unless-stopped
    ports:
      - '8954:80'
    labels:
      - 'traefik.enable=true'
      - 'traefik.backend=json-server'
      - 'traefik.frontend.rule=Host:json-server.mydomain.com'
      - 'traefik.docker.network=proxy'
      - 'traefik.port=80'
    volumes:
      - './data:/data'
    networks:
      - proxy

I plopped in my compose template and adjusted a few things and the above is what I got. The only "external" things that I needed to do were:

  • set up the CNAME for the domain and plop it into the compose file (since I'm already running Traefik, the routing is trivial via labels)
  • run a mkdir data in the root dir of my project
  • run docker-compose up -d
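Those three steps, sketched out (json-server.mydomain.com and the paths are my values; adjust to match your own setup):

```shell
# Run from the root of the project, next to docker-compose.yml.
mkdir -p data             # this dir gets mapped to the container's /data volume
# docker-compose up -d    # then bring the container up in the background
```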

Initially, I had the db.json file specified in the volumes section of my compose file as ./data/db.json:/data/db.json and had seeded it with a basic JSON entry, but the container wasn't particularly happy with that arrangement. After tinkering with it for a little while, I figured out that mapping the whole data dir worked, but ONLY after I had a db.json file in there with the correct permissions.
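For reference, a seed file along the lines of the one I used can be dropped in like this (the posts resource name is just a placeholder, not something json-server requires):

```shell
# Create the data dir and a minimal db.json inside it.
mkdir -p data
cat > data/db.json <<'EOF'
{
  "posts": [
    { "id": 1, "title": "test entry" }
  ]
}
EOF
```

json-server turns each top-level key in that file into its own resource endpoint, so this one would serve a /posts route.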

So basically, my initial try looked like this:

Run the container with my hand-created data dir and db.json file inside.

The volume was mapped as ./data/db.json:/data/db.json.

Upon starting the container, a temp file called .db.json was created in my data directory. That's where my data was being written instead of my db file.

well shit

Fixing things

Upon further inspection, I realized (as most volume mapping issues go) that the permissions weren't set correctly on my data directory in the first place. I'd imagine that if I let the container run once (without specifying anything or setting up my own db.json file), I wouldn't have had this problem. That's an experiment for next time.

However, after shutting the container down, deleting my initial db.json file, and re-running it, I was still having the same problem. Again, chalk this one up to more permission issues...

The fix? chown -R root:root data/

Yep. Upon changing the data folder to be owned by root, everything was peachy and ran correctly. I tested out POST requests and all of my new data was updating correctly in the db.json file with zero issues.

So, in the future, I will use the compose file that I posted above, run it once, then shut it down, edit the db.json file in the data dir (or, if one doesn't get created automatically, at least do a sudo touch data/db.json to get most of the way there), and then add my own test data. That seems like a working combo, but YMMV there.
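A quick sketch of that combo, assuming the compose file from above (the docker-compose lines are shown commented so the file-level steps stand out):

```shell
# docker-compose up -d            # first run: let the container create its own state
# docker-compose down             # then shut it back down
mkdir -p data                     # make sure the data dir and db file exist...
touch data/db.json                # ...(use sudo here if the container already chowned things)
# sudo chown -R root:root data/   # the permissions fix from above: the container runs as root
# Now drop your test data into data/db.json and docker-compose up -d again.
```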

Fake data is great

And in development use, does it work? Sure does. It does exactly what I wanted a quick server to do. I can edit the db.json file quickly if I need to, or edit it via requests. It emulates the much more full-featured REST API that I'll likely have to spend the time building out once I get to that point in my application. For now, it just does what it's supposed to do and stays out of the way. I'll call that a win.


Also of note: that post above led me to grab Postman, which is a super nifty piece of software (once you figure it out). It definitely makes doing GET, POST, PUT, PATCH, WHATEVER requests easy, and shows you exactly what's in your "db" quickly. Thumbs way up, there.
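For the curl crowd, the same kinds of requests look something like this (the hostname is the Traefik one from my compose file, and a posts resource is assumed to exist in db.json; the curl lines are commented since they need the container running):

```shell
# A JSON payload to send to the fake API.
payload='{"title": "new post"}'

# GET everything under the posts resource:
# curl http://json-server.mydomain.com/posts

# POST a new entry (json-server writes it into db.json and assigns an id):
# curl -X POST -H 'Content-Type: application/json' \
#      -d "$payload" http://json-server.mydomain.com/posts

echo "$payload"
```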