Published on 19 May 2022 by Andrew Owen (6 minutes)
ReadMe.io is a popular user docs site. It has a Markdown editor, theme builder and Swagger / OpenAPI file import. It’s fast and responsive, and it looks nice. But the last time I checked, all your content goes in a bucket that you don’t have direct access to.
One upshot of this is that if you want to make changes across documents, such as changing a product name, you have to edit documents individually. It would be much nicer if you could directly access the repository in VS Code and make site-wide changes.
Even though this isn’t supported, there are a couple of features in Readme.io that can enable you to take control of your data. First, you can export your content. Second, Readme.io has an API. There are some tasks that you’ll still have to perform in the web interface. But for the most part, you can keep your data in a Git repository, make local changes, and use the API to push your changes to your live site.
When I was getting started, I found it useful to use Postman to query the Readme.io site. You can save this collection as `readme.postman_collection.json`:
```json
{
  "info": {
    "_postman_id": "<!--insert your postman ID here-->",
    "name": "readme.io",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "Get doc",
      "request": {
        "method": "GET",
        "header": [],
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs/{{slug}}",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs", "{{slug}}"]
        }
      },
      "response": []
    },
    {
      "name": "Get category ID",
      "request": {
        "method": "GET",
        "header": [],
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs/get-category-id",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs", "get-category-id"]
        }
      },
      "response": []
    },
    {
      "name": "Update doc",
      "request": {
        "method": "PUT",
        "header": [],
        "body": {
          "mode": "raw",
          "raw": "{\n  \"title\": \"{{title}}\",\n  \"excerpt\": \"{{excerpt}}\",\n  \"category\": \"{{category}}\",\n  \"hidden\": {{hidden}},\n  \"body\": \"{{body}}\"\n}",
          "options": {
            "raw": {
              "language": "json"
            }
          }
        },
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs/{{slug}}",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs", "{{slug}}"]
        }
      },
      "response": []
    },
    {
      "name": "Delete doc",
      "request": {
        "method": "DELETE",
        "header": [],
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs/{{slug}}",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs", "{{slug}}"]
        }
      },
      "response": []
    },
    {
      "name": "Create doc",
      "request": {
        "method": "POST",
        "header": [],
        "body": {
          "mode": "raw",
          "raw": "{\n  \"title\": \"{{title}}\",\n  \"excerpt\": \"{{excerpt}}\",\n  \"category\": \"{{category}}\",\n  \"hidden\": true,\n  \"body\": \"{{body}}\"\n}",
          "options": {
            "raw": {
              "language": "json"
            }
          }
        },
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs"]
        }
      },
      "response": []
    },
    {
      "name": "Search docs",
      "request": {
        "method": "POST",
        "header": [],
        "url": {
          "raw": "https://dash.readme.com/api/v1/docs/search?search=sphinx",
          "protocol": "https",
          "host": ["dash", "readme", "com"],
          "path": ["api", "v1", "docs", "search"],
          "query": [
            {
              "key": "search",
              "value": "sphinx"
            }
          ]
        }
      },
      "response": []
    }
  ],
  "auth": {
    "type": "basic",
    "basic": [
      {
        "key": "username",
        "value": "{{apiKey}}",
        "type": "string"
      }
    ]
  },
  "event": [
    {
      "listen": "prerequest",
      "script": {
        "type": "text/javascript",
        "exec": [""]
      }
    },
    {
      "listen": "test",
      "script": {
        "type": "text/javascript",
        "exec": [""]
      }
    }
  ],
  "variable": [
    {
      "key": "category",
      "value": "<!--get the category ID from the web interface-->"
    },
    {
      "key": "slug",
      "value": "sandbox"
    },
    {
      "key": "title",
      "value": "Sandbox"
    },
    {
      "key": "excerpt",
      "value": "A scratch article for testing the API."
    },
    {
      "key": "hidden",
      "value": "true"
    },
    {
      "key": "body",
      "value": "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum."
    }
  ]
}
```
Import the collection to Postman and use Basic Auth with the API key as the username and an empty password.
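Postman encodes those credentials for you; outside Postman, the equivalent Basic auth token is the base64 of the API key followed by a colon (for the empty password). A minimal sketch, using a placeholder key:

```shell
# Build a Basic auth token: base64 of "<api key>:" (empty password).
README_API_KEY="demo"   # placeholder; substitute your real ReadMe API key
AUTH=$(printf '%s:' "$README_API_KEY" | base64)
echo "$AUTH"            # with the placeholder key, prints ZGVtbzo=
```

This is the same token the Bash script later in this post expects in `$AUTH`.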
The Create doc endpoint requires the `title` and `category` attributes; `excerpt`, `hidden`, and `body` are optional. The `hidden` attribute is always set to `true`. The final publishing step, after reviewing the content as it will appear, is to set it to `false`.
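As a sketch of that final step outside Postman, assuming the API accepts a partial update (if it doesn't, resend the full body used in the Update doc request):

```shell
# Publish a reviewed article by flipping "hidden" to false.
# The slug ("sandbox") and AUTH value are placeholders.
AUTH="<!--your AUTH token-->"
PAYLOAD='{"hidden": false}'
curl -X PUT \
  -H 'Content-Type: application/json' \
  -H "Authorization: Basic $AUTH" \
  -d "$PAYLOAD" \
  "https://dash.readme.com/api/v1/docs/sandbox"
```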
Unless I missed something, when I originally came up with this solution, it was only possible to create categories on Readme.io. But there is an API for that. I don’t currently have an active Readme.io subscription to test, so I’ll leave it to you to add those endpoints to the Postman collection.
Articles can be published under existing articles. In this case, do a `GET` on the parent article to get the category ID.
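A minimal sketch of that lookup with curl and jq, assuming the response carries the ID in a top-level `category` field (which you can confirm with the Get doc request above) and using a hypothetical slug:

```shell
# Fetch the parent article and print its category ID.
# $AUTH is the Basic auth token; "parent-article-slug" is a placeholder.
curl -s -H "Authorization: Basic $AUTH" \
  "https://dash.readme.com/api/v1/docs/parent-article-slug" |
  jq -r '.category'
```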
Readme.io stores articles in Markdown format (with a YAML header), but the API requires the data in JSON format. Fortunately, there's a command-line tool called `jq` that can encapsulate the Markdown in a JSON string, which becomes the `body` attribute. From the CLI, enter `jq -R -s . < filename.md > filename.txt`.
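For example, running it on a small file collapses the whole file into a single JSON string, with newlines escaped as `\n`:

```shell
# Wrap a Markdown file in a JSON string suitable for the "body" attribute.
printf '# Title\n\nSome *Markdown* text.\n' > filename.md
jq -R -s . < filename.md > filename.txt
cat filename.txt   # one JSON string: "# Title\n\nSome *Markdown* text.\n"
rm filename.md filename.txt
```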
I wrote a Bash script that automatically updates all the `.md` files in a repository using the `PUT` method. It's a bit of a hack: it requires the `$AUTH` token to be defined, and it only works on files whose header is exactly eight lines:
```yaml
---
title: "article title"
category: "category UUID"
excerpt: "article description"
hidden: true
createdAt: "2022-05-19T00:00:00.000Z"
updatedAt: "2022-05-19T10:00:00.001Z"
---
```
By default, articles downloaded from Readme.io include `slug` metadata but no `category` metadata. Since the slug is derived from the filename, you need to replace the `slug` metadata with `category` metadata, giving it the appropriate UUID for the category heading the article appears under.
Note: if you don't include an `excerpt` definition, you must ensure the body text starts on line 9 of the file.
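The split itself is two `sed` calls. Here's what they produce on a file with the eight-line header shown above (the file content is illustrative):

```shell
# Reproduce the split the script relies on: an eight-line YAML header,
# body starting on line 9.
cat > article.md <<'EOF'
---
title: "article title"
category: "category UUID"
excerpt: "article description"
hidden: true
createdAt: "2022-05-19T00:00:00.000Z"
updatedAt: "2022-05-19T10:00:00.001Z"
---
Body starts here.
EOF
sed '1,8d' article.md   # body only: "Body starts here."
sed '6,$d' article.md   # first five header lines, reused for the JSON payload
rm article.md
```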
Here's the Bash script. Save it as `md2json.sh` in the `v1.0` folder of your exported site:
```bash
#!/usr/bin/env bash
# Push every exported article back to Readme.io with one PUT per file.
# Requires jq and yq on the PATH, and the header format described above.
export AUTH="<!--your AUTH token-->"
for subdir in *; do
  test -d "$subdir" || continue
  echo "$subdir"
  cd "$subdir"
  for f in *.md; do
    SLUG=${f%%.*}
    sed '1,8d' "$f" > "$SLUG.tmp"           # body: everything after the 8-line header
    sed '6,$d' "$f" > "$SLUG.yml"           # keep the first five header lines
    jq -R -s . < "$SLUG.tmp" > "$SLUG.bdy"  # wrap the Markdown body as a JSON string
    rm "$SLUG.tmp"
    printf "body: " >> "$SLUG.yml"
    cat "$SLUG.bdy" >> "$SLUG.yml"
    yq . < "$SLUG.yml" > "$SLUG.json"       # YAML to JSON (Python yq wrapper around jq)
    rm "$SLUG.bdy"
    rm "$SLUG.yml"
    curl -X PUT -H 'Content-Type: application/json' \
      -H "Authorization: Basic $AUTH" \
      -d "$(<"$SLUG.json")" \
      "https://dash.readme.com/api/v1/docs/$SLUG"
    rm "$SLUG.json"
  done
  cd ..
done
```
The last part of the puzzle is managing images. At the time of writing, I don't know of a way to upload images using the API. My suggestion is to keep a hidden article on the website and add images to it using the web interface. Readme.io will rename each file and assign it an ID, and you can then extract this information from the article. I'd also add a copy of each image, under its modified filename, to your repository (in case you ever want to migrate to a different docs solution). This also enables you to preview your content locally before pushing it to the live site.
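If you mirror the images into the repository anyway, a one-liner can list every image URL the export references. This assumes uploaded files end up under `files.readme.io`, which you should verify against your own export:

```shell
# List every ReadMe-hosted image URL referenced in the exported Markdown,
# so the renamed files can be copied into the repository.
grep -rhoE 'https://files\.readme\.io/[^)" ]+' --include='*.md' . | sort -u
```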