24 Commits

Author SHA1 Message Date
29bdad09bf reduced Docker image size
All checks were successful
CI / Build Pip package (push) Successful in 16s
CI / Build Docker image (push) Successful in 3m6s
2024-10-25 07:25:13 +08:00
16dbd3a82a added nginx configuration file
All checks were successful
CI / Build Pip package (push) Successful in 16s
CI / Build Docker image (push) Successful in 3m33s
2024-10-24 23:55:21 +08:00
33a3858b02 added cli 2024-10-24 23:26:06 +08:00
e2e4083321 switched from aiohttp to httpx 2024-10-24 23:18:23 +08:00
2ca7f8e908 updated README.md
All checks were successful
CI / Build Pip package (push) Successful in 19s
CI / Build Docker image (push) Successful in 3m44s
2024-10-23 21:09:11 +08:00
7d9cae76c1 added plantUML support
fixed bug with subscription being notified multiple times
2024-10-23 20:57:20 +08:00
a4b3d10c66 markdown rendering offloaded to the default threadpool
Some checks failed
CI / Build Pip package (push) Failing after 14s
CI / Build Docker image (push) Successful in 1m1s
2024-10-22 23:13:22 +08:00
011042bc15 fixed async file reads
All checks were successful
CI / Build Pip package (push) Successful in 15s
CI / Build Docker image (push) Successful in 3m27s
2024-10-21 19:45:41 +08:00
5229ed167b fixed Docker image build
Some checks failed
CI / Build Pip package (push) Failing after 15s
CI / Build Docker image (push) Failing after 51s
2024-10-21 14:13:29 +08:00
5fde70418b fixed version retrieval in Gitea pipeline
Some checks failed
CI / Build Pip package (push) Successful in 31s
CI / Build Docker image (push) Failing after 14s
2024-10-21 13:50:06 +08:00
ec45c719ee added 'aiofiles' to make file operations async
Some checks failed
CI / Build Pip package (push) Successful in 32s
CI / Build Docker image (push) Failing after 14s
2024-10-21 13:40:19 +08:00
94670e5aaf added dynamic version support and solved loop call issue
Some checks failed
CI / Build Pip package (push) Successful in 15s
CI / Build Docker image (push) Failing after 7s
2024-10-21 06:51:41 +08:00
707c74f042 renamed project to Bugis and switched from WSGI@uwsgi to ASGI@granian
Some checks failed
CI / Build pip package (push) Failing after 13s
CI / Build Docker image (push) Successful in 4m36s
2024-10-20 20:35:46 +08:00
49a9bad07f updated Docker build 2024-08-16 06:24:28 +08:00
ef8da4e6cc added prefix option 2023-10-20 15:35:09 +08:00
40bd2111bf support deployment in relative url 2023-10-20 08:19:15 +08:00
335c2ddd7f removed unused module 2023-10-19 22:05:38 +08:00
3626cd7980 added hot reload and Gevent to uwsgi server 2023-10-19 21:57:29 +08:00
67948f81c4 Added README.md 2023-05-22 17:21:15 +08:00
270767c3cd replaced inotify with watchdog 2021-11-09 09:25:39 +00:00
26f7909c33 use socketserver for better compatibility with older Python versions 2020-03-15 15:19:12 +00:00
2b71048b65 improved Docker image build 2020-03-15 14:31:48 +00:00
5e9eaba794 added graphviz support 2020-03-13 20:48:26 +00:00
1e75eaf836 added handling of mutable static resources 2020-03-03 14:17:22 +00:00
35 changed files with 1683 additions and 399 deletions


@@ -0,0 +1,71 @@
name: CI
on:
  push:
    tags:
      - '*'
jobs:
  build_pip_package:
    name: "Build Pip package"
    runs-on: woryzen
    steps:
      - name: Checkout sources
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          fetch-tags: true
      - uses: actions/setup-python@v5
        with:
          cache: 'pip'
      - name: Create virtualenv
        run: |
          python -m venv .venv
          .venv/bin/pip install -r requirements-dev.txt
      - name: Execute build
        run: |
          .venv/bin/python -m build
      - name: Publish artifacts
        env:
          TWINE_REPOSITORY_URL: ${{ vars.PYPI_REGISTRY_URL }}
          TWINE_USERNAME: ${{ vars.PUBLISHER_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PUBLISHER_TOKEN }}
        run: |
          .venv/bin/python -m twine upload --repository gitea dist/*{.whl,tar.gz}
  build_docker_image:
    name: "Build Docker image"
    runs-on: woryzen
    steps:
      - name: Checkout sources
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          fetch-tags: true
      - name: Get package version
        run: |
          echo VERSION=$(python -m setuptools_scm) >> "$GITHUB_ENV"
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3.4.0
        with:
          driver: docker-container
          platforms: |
            linux/amd64
            linux/arm64
      - name: Login to Gitea container registry
        uses: docker/login-action@v3
        with:
          registry: gitea.woggioni.net
          username: woggioni
          password: ${{ secrets.PUBLISHER_TOKEN }}
      - name: Build and push bugis images
        uses: docker/build-push-action@v6
        with:
          context: .
          platforms: |
            linux/amd64
            linux/arm64
          push: true
          pull: true
          tags: |
            "gitea.woggioni.net/woggioni/bugis:latest"
            "gitea.woggioni.net/woggioni/bugis:${{ env.VERSION }}"
          cache-from: type=registry,ref=gitea.woggioni.net/woggioni/bugis:buildx
          cache-to: type=registry,mode=max,compression=zstd,image-manifest=true,oci-mediatypes=true,ref=gitea.woggioni.net/woggioni/bugis:buildx
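
Both jobs run only on tag pushes (`on.push.tags: '*'`), and the package version is derived from the tag via `setuptools_scm`, so publishing a release amounts to pushing a new tag. A minimal sketch (the tag name is illustrative):

```bash
# create and push a tag; setuptools_scm derives the package version from it
git tag 2024.10.25
git push origin 2024.10.25
```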

.gitignore (vendored, new file, 7 lines)

@@ -0,0 +1,7 @@
.venv
__pycache__
*.pyc
src/bugis/_version.py
*.egg-info
/build
/dist

Dockerfile (new file, 42 lines)

@@ -0,0 +1,42 @@
FROM alpine:latest AS base
LABEL org.opencontainers.image.authors=oggioni.walter@gmail.com
RUN --mount=type=cache,target=/var/cache/apk apk update
RUN --mount=type=cache,target=/var/cache/apk apk add python3 py3-pip graphviz
FROM base AS build
RUN --mount=type=cache,target=/var/cache/apk apk add musl-dev gcc graphviz-dev git
RUN adduser -D luser
USER luser
WORKDIR /home/luser
COPY --chown=luser:users ./requirements-dev.txt ./requirements-dev.txt
COPY --chown=luser:users ./requirements-run.txt ./requirements-run.txt
WORKDIR /home/luser/
RUN python -m venv .venv
RUN --mount=type=cache,target=/home/luser/.cache/pip,uid=1000,gid=1000 .venv/bin/pip wheel -w /home/luser/wheel pygraphviz
RUN --mount=type=cache,target=/home/luser/.cache/pip,uid=1000,gid=1000 .venv/bin/pip install -r requirements-dev.txt /home/luser/wheel/*.whl
COPY --chown=luser:users . /home/luser/bugis
WORKDIR /home/luser/bugis
RUN rm -rf .venv dist build
RUN --mount=type=cache,target=/home/luser/.cache/pip,uid=1000,gid=1000 /home/luser/.venv/bin/python -m build
FROM base AS release
RUN mkdir /srv/http
RUN adduser -D -h /var/lib/bugis -u 1000 bugis
USER bugis
WORKDIR /var/lib/bugis
COPY --chown=bugis:users conf/pip.conf ./.pip/pip.conf
RUN python -m venv .venv
RUN --mount=type=cache,target=/var/bugis/.cache/pip,uid=1000,gid=1000 --mount=type=bind,ro,from=build,source=/home/luser/requirements-run.txt,target=/requirements-run.txt --mount=type=bind,ro,from=build,source=/home/luser/wheel,target=/wheel .venv/bin/pip install -r /requirements-run.txt /wheel/*.whl
RUN --mount=type=cache,target=/var/bugis/.cache/pip,uid=1000,gid=1000 --mount=type=bind,ro,from=build,source=/home/luser/bugis/dist,target=/dist .venv/bin/pip install /dist/*.whl
VOLUME /srv/http
WORKDIR /srv/http
ENV GRANIAN_HOST=0.0.0.0
ENV GRANIAN_PORT=8000
ENV GRANIAN_INTERFACE=asgi
ENV GRANIAN_LOOP=asyncio
ENV GRANIAN_LOG_ENABLED=false
ENTRYPOINT ["/var/lib/bugis/.venv/bin/python", "-m", "granian", "bugis.asgi:application"]
EXPOSE 8000/tcp

README.md (new file, 23 lines)

@@ -0,0 +1,23 @@
## How to
### Build the docker image
```bash
docker build -t gitea.woggioni.net/woggioni/bugis:latest .
```
### Build the pip package
```bash
pyproject-build
```
### Run in docker
```bash
docker run --rm -v /your/document/directory:/srv/http --user $(id -u):$(id -g) -p 127.0.0.1:8000:8000 gitea.woggioni.net/woggioni/bugis:latest
```
### Run in docker with `nginx` and `plantUML` server
```bash
docker compose up --build
```
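### Run from a local installation
The package also installs a `bugis` console script (see `[project.scripts]` in `pyproject.toml`). A minimal sketch, assuming the wheel built above has been installed into the current environment:
```bash
pip install dist/bugis-*.whl
# serve the current working directory; -c/--configuration is optional and defaults to
# ~/.config/bugis/bugis.yaml (or $XDG_CONFIG_HOME/bugis/bugis.yaml) when that file exists
bugis
```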

conf/nginx-bugis.conf (new file, 14 lines)

@@ -0,0 +1,14 @@
server {
    listen 8080;
    http2 on;
    server_name localhost;
    location / {
        proxy_pass http://granian:8000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_read_timeout 60s;
    }
}

conf/pip.conf (new file, 3 lines)

@@ -0,0 +1,3 @@
[global]
index-url = https://gitea.woggioni.net/api/packages/woggioni/pypi/simple
extra-index-url = https://pypi.org/simple

docker-compose.yaml (new file, 46 lines)

@@ -0,0 +1,46 @@
networks:
  default:
    external: false
    ipam:
      driver: default
      config:
        - subnet: 172.128.0.0/16
          ip_range: 172.128.0.0/16
          gateway: 172.128.0.254
services:
  granian:
    build:
      context: .
    user: $UID:$GID
    restart: unless-stopped
    # container_name: granian
    environment:
      PLANT_UML_SERVER_ADDRESS: http://plant_uml:8080
    deploy:
      resources:
        limits:
          cpus: "0.5"
          memory: 512M
    volumes:
      - ${STATIC_ROOT}:/srv/http
  plant_uml:
    image: plantuml/plantuml-server:jetty
    # container_name: plantUML
    restart: unless-stopped
    tmpfs: /tmp/jetty
    deploy:
      resources:
        limits:
          cpus: "4"
          memory: 1G
  nginx:
    image: gitea.woggioni.net/woggioni/nginx:v1.27.2
    # container_name: nginx
    restart: unless-stopped
    depends_on:
      - granian
    volumes:
      - ./conf/nginx-bugis.conf:/etc/nginx/conf.d/bugis.conf:ro
    ports:
      - 127.0.0.1:80:8080
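
The compose file interpolates `$UID`, `$GID` and `${STATIC_ROOT}` from the calling environment. A minimal `.env` sketch (the values are placeholders, adjust them to your user and document root):

```
UID=1000
GID=1000
STATIC_ROOT=/home/user/documents
```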


@@ -1,15 +0,0 @@
FROM alpine:latest
MAINTAINER Oggioni Walter <oggioni.walter@gmail.com>
RUN apk update
RUN apk add python3 uwsgi uwsgi-python3
RUN mkdir /srv/http
VOLUME /srv/http
WORKDIR /srv/http
ADD md2html-*.whl /
RUN pip3 install /md2html-*.whl && rm /md2html-*.whl
ENTRYPOINT ["uwsgi"]
EXPOSE 1180/tcp
EXPOSE 1180/udp
USER nobody
CMD ["--plugin", "/usr/lib/uwsgi/python_plugin.so", "-s", ":1180", "-w", "md2html.uwsgi"]


@@ -1,213 +0,0 @@
#!/usr/bin/env python3
import argparse
import hashlib
import sys
import threading
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler
from os.path import basename, dirname, abspath, join
from urllib.parse import urlparse
import markdown
STATIC_CACHE = {}
def load_from_cache(path):
global STATIC_CACHE
if path not in STATIC_CACHE:
with open(join(dirname(__file__), 'static') + path, 'r') as static_file:
STATIC_CACHE[path] = static_file.read()
return STATIC_CACHE[path]
def compile_html(mdfile=None, extensions=None, raw=None, **kwargs):
html = None
with mdfile and open(mdfile, 'r') or sys.stdin as instream:
html = markdown.markdown(instream.read(), extensions=extensions, output_format='html5')
if raw:
doc = html
else:
css = ' <style>%s\n%s\n </style>' % (
load_from_cache('/github-markdown.css'),
load_from_cache('/custom.css')
)
doc = load_from_cache('/template.html').format(content=html, script='', css=css)
return doc
class MarkdownHTTPServer(ThreadingHTTPServer):
def __init__(self, mdfile, extensions=(), handler=BaseHTTPRequestHandler, interface="127.0.0.1", port=8080):
import inotify
import inotify.adapters
import signal
self.stop = False
def sigint_handler(signum, frame):
self.stop = True
handlers = (sigint_handler, signal.getsignal(signal.SIGINT))
signal.signal(signal.SIGINT, lambda signum, frame: [handler(signum, frame) for handler in handlers])
self.mdfile = mdfile
self.extensions = extensions
self.condition_variable = threading.Condition()
self.hash = None
self.etag = None
def watch_file():
watcher = inotify.adapters.Inotify()
watcher.add_watch(dirname(abspath(self.mdfile)))
target_file = basename(self.mdfile)
while True:
if self.stop:
break
for event in watcher.event_gen(yield_nones=True, timeout_s=1):
if not event:
continue
(_, event_type, path, filename) = event
if filename == target_file and len(set(event_type).intersection(
{'IN_CLOSE_WRITE'})):
self.condition_variable.acquire()
if self.update_file_digest():
self.condition_variable.notify_all()
self.condition_variable.release()
file_watcher = threading.Thread(target=watch_file)
file_watcher.start()
super().__init__((interface, port), handler)
def update_file_digest(self):
md5 = hashlib.md5()
with open(self.mdfile, 'rb') as mdfile:
md5.update(mdfile.read())
digest = md5.digest()
if not self.hash or self.hash != digest:
self.hash = digest
self.etag = md5.hexdigest()
return True
else:
return False
class MarkdownRequestHandler(BaseHTTPRequestHandler):
status_map = {
200: "OK",
204: "No Content",
304: "Not Modified",
400: "Bad Request",
401: "Unauthorized",
404: "Not Found",
499: "Service Error",
500: "Internal Server Error",
501: "Not Implemented",
503: "Service Unavailable"
}
def answer(self, code, reply=None, content_type="text/plain",
headers=()):
output = self.wfile
if not reply:
reply = MarkdownRequestHandler.status_map[code]
try:
self.send_response(code, MarkdownRequestHandler.status_map[code])
for header in headers:
self.send_header(*header)
self.send_header("Content-Type", content_type)
self.send_header('Content-Length', len(reply))
self.end_headers()
output.write(reply.encode("UTF-8"))
output.flush()
except BrokenPipeError:
pass
def markdown_answer(self):
if not self.server.etag:
self.server.condition_variable.acquire()
self.server.update_file_digest()
self.server.condition_variable.release()
self.answer(200, headers=(('Etag', self.server.etag),),
reply=compile_html(mdfile=self.server.mdfile, extensions=self.server.extensions, raw=True),
content_type='text/html')
def do_GET(self):
path = urlparse(self.path)
if path.path == '/':
self.answer(200, reply=load_from_cache('/template.html').format(
content='',
script='<script src="/hot-reload.js", type="text/javascript"></script>',
css='<link rel="stylesheet" href="github-markdown.css">'
'<link rel="stylesheet" href="custom.css">'),
content_type='text/html')
elif path.path in {'/github-markdown.css', '/custom.css', '/hot-reload.js'}:
self.answer(200, load_from_cache(path.path), content_type='text/css')
elif path.path == '/markdown':
self.markdown_answer()
elif path.path == '/reload':
if 'If-None-Match' not in self.headers or self.headers['If-None-Match'] != self.server.etag:
self.markdown_answer()
else:
self.server.condition_variable.acquire()
self.server.condition_variable.wait(timeout=10)
self.server.condition_variable.release()
if self.server.stop:
self.answer(503)
elif self.headers['If-None-Match'] == self.server.etag:
self.answer(304)
else:
self.answer(200, headers=(('Etag', self.server.etag),),
reply=compile_html(mdfile=self.server.mdfile,
extensions=self.server.extensions,
raw=True),
content_type='text/html')
else:
self.answer(404)
def parse_args(args=None):
parser = argparse.ArgumentParser(description='Make a complete, styled HTML document from a Markdown file.')
parser.add_argument('mdfile', help='File to convert. Defaults to stdin.')
parser.add_argument('-o', '--out', help='Output file name. Defaults to stdout.')
parser.add_argument('-r', '--raw', action='store_true',
help='Just output a raw html fragment, as returned from the markdown module')
parser.add_argument('-e', '--extensions', nargs='+', default=['extra', 'smarty', 'tables'],
help='Activate specified markdown extensions (defaults to "extra smarty tables")')
try:
import inotify
import gevent
import signal
parser.add_argument('-w', '--watch', action='store_true',
help='Watch specified source file and rerun the compilation for every time it changes')
parser.add_argument('-p', '--port', default=5000, type=int,
help='Specify http server port (defaults to 5000)')
parser.add_argument('-i', '--interface', default='',
help='Specify http server listen interface (defaults to localhost)')
except ImportError:
pass
return parser.parse_args(args)
def write_html(out=None, **kwargs):
doc = compile_html(**kwargs)
with (out and open(out, 'w')) or sys.stdout as outstream:
outstream.write(doc)
def main(args=None):
args = parse_args(args)
if hasattr(args, 'watch') and args.watch:
server = MarkdownHTTPServer(args.mdfile,
extensions=args.extensions,
interface=args.interface,
port=args.port,
handler=MarkdownRequestHandler)
server.serve_forever()
else:
write_html(**vars(args))
if __name__ == '__main__':
main()


@@ -1,19 +0,0 @@
function req(first) {
var xmlhttp = new XMLHttpRequest();
xmlhttp.onload = function() {
if (xmlhttp.status == 200) {
document.querySelector("article.markdown-body").innerHTML = xmlhttp.responseText;
} else if(xmlhttp.status == 304) {
} else {
console.log(xmlhttp.status, xmlhttp.statusText);
}
req(false);
};
xmlhttp.onerror = function() {
console.log(xmlhttp.status, xmlhttp.statusText);
setTimeout(req, 1000, false);
};
xmlhttp.open("GET", first ? "/markdown" : "/reload", true);
xmlhttp.send();
}
req(true);


@@ -1,108 +0,0 @@
import logging
from os import getcwd, listdir
from os.path import exists, splitext, isfile, join, relpath, isdir, basename
from mimetypes import init as mimeinit, guess_type
import hashlib
from .md2html import compile_html
mimeinit()
log = logging.getLogger(__name__)
logging.basicConfig(level=logging.INFO)
cwd = getcwd()
def is_markdown(filepath):
_, ext = splitext(filepath)
return ext == ".md"
cache = dict()
def file_hash(filepath, bufsize=4096):
if bufsize <= 0:
raise ValueError("Buffer size must be greater than 0")
md5 = hashlib.md5()
with open(filepath, 'rb') as f:
while True:
buf = f.read(bufsize)
if len(buf) == 0:
break
md5.update(buf)
return md5.digest()
def application(env, start_response):
path = join(cwd, relpath(env['PATH_INFO'], '/'))
if exists(path):
if isfile(path):
if path not in cache:
digest = file_hash(path).hex()
cache[path] = digest
else:
digest = cache[path]
def parse_etag(etag):
if etag is None:
return
start = etag.find('"')
if start < 0:
return
end = etag.find('"', start + 1)
return etag[start + 1: end]
etag = parse_etag(env.get('HTTP_IF_NONE_MATCH'))
if etag and etag == digest:
start_response('304 Not Modified', [
('Etag', '"%s"' % digest),
('Cache-Control', 'no-cache, must-revalidate, max-age=86400'),
])
return []
elif is_markdown(path):
body = compile_html(path, ['extra', 'smarty', 'tables']).encode()
start_response('200 OK', [('Content-Type', 'text/html; charset=UTF-8'),
('Etag', '"%s"' % digest),
('Cache-Control', 'no-cache, must-revalidate, max-age=86400'),
])
return [body]
else:
def read_file(file_path):
buffer_size = 1024
with open(file_path, 'rb') as f:
while True:
result = f.read(buffer_size)
if len(result) == 0:
break
yield result
start_response('200 OK', [('Content-Type', guess_type(basename(path))[0] or 'application/octet-stream'),
('Etag', '"%s"' % digest),
('Cache-Control', 'no-cache, must-revalidate, max-age=86400'),
])
return read_file(path)
elif isdir(path):
body = directory_listing(env['PATH_INFO'], path).encode()
start_response('200 OK', [
('Content-Type', 'text/html; charset=UTF-8'),
])
return [body]
start_response('404 NOT_FOUND', [])
return []
def directory_listing(path_info, path):
title = "Directory listing for %s" % path_info
result = "<!DOCTYPE html><html><head>"
result += "<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">"
result += "<title>" + title + "</title></head>"
result += "<body><h1>" + title + "</h1><hr>"
result += "<ul>"
if path_info != '/':
result += "<li><a href=\"../\"/>../</li>"
def ls(filter):
return (entry for entry in sorted(listdir(path)) if filter(join(path, entry)))
for entry in ls(isdir):
result += '<li><a href="' + entry + '/' + '"/>' + entry + '/' + '</li>'
for entry in ls(lambda entry: isfile(entry) and is_markdown(entry)):
result += '<li><a href="' + entry + '"/>' + entry + '</li>'
return result

pyproject.toml (new file, 65 lines)

@@ -0,0 +1,65 @@
[build-system]
requires = ["setuptools>=61.0", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"
[project]
name = "bugis"
dynamic = ["version"]
authors = [
{ name="Walter Oggioni", email="oggioni.walter@gmail.com" },
]
description = "Markdown to HTML renderer"
readme = "README.md"
requires-python = ">=3.10"
classifiers = [
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
'License :: OSI Approved :: MIT License',
'Intended Audience :: System Administrators',
'Intended Audience :: Developers',
'Environment :: Console',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 3',
]
dependencies = [
"Markdown",
"Pygments",
"watchdog",
"pwo",
"PyYAML",
"pygraphviz",
"aiofiles",
"httpx[http2]"
]
[project.optional-dependencies]
dev = [
"build", "granian", "mypy", "ipdb", "twine"
]
run = [
"granian"
]
[tool.setuptools.package-data]
bugis = ['static/*', 'default-conf/*']
[project.urls]
"Homepage" = "https://github.com/woggioni/bugis"
"Bug Tracker" = "https://github.com/woggioni/bugis/issues"
[tool.mypy]
python_version = "3.12"
disallow_untyped_defs = true
show_error_codes = true
no_implicit_optional = true
warn_return_any = true
warn_unused_ignores = true
exclude = ["scripts", "docs", "test"]
strict = true
[tool.setuptools_scm]
version_file = "src/bugis/_version.py"
[project.scripts]
bugis = "bugis.cli:main"

requirements-dev.txt (new file, 169 lines)

@@ -0,0 +1,169 @@
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --extra=dev --output-file=requirements-dev.txt pyproject.toml
#
--index-url https://gitea.woggioni.net/api/packages/woggioni/pypi/simple
--extra-index-url https://pypi.org/simple
aiofiles==24.1.0
# via bugis (pyproject.toml)
anyio==4.6.2.post1
# via httpx
asttokens==2.4.1
# via stack-data
build==1.2.2.post1
# via bugis (pyproject.toml)
certifi==2024.8.30
# via
# httpcore
# httpx
# requests
cffi==1.17.1
# via cryptography
charset-normalizer==3.4.0
# via requests
click==8.1.7
# via granian
cryptography==43.0.3
# via secretstorage
decorator==5.1.1
# via
# ipdb
# ipython
docutils==0.21.2
# via readme-renderer
executing==2.1.0
# via stack-data
granian==1.6.1
# via bugis (pyproject.toml)
h11==0.14.0
# via httpcore
h2==4.1.0
# via httpx
hpack==4.0.0
# via h2
httpcore==1.0.6
# via httpx
httpx[http2]==0.27.2
# via bugis (pyproject.toml)
hyperframe==6.0.1
# via h2
idna==3.10
# via
# anyio
# httpx
# requests
importlib-metadata==8.5.0
# via twine
ipdb==0.13.13
# via bugis (pyproject.toml)
ipython==8.28.0
# via ipdb
jaraco-classes==3.4.0
# via keyring
jaraco-context==6.0.1
# via keyring
jaraco-functools==4.1.0
# via keyring
jedi==0.19.1
# via ipython
jeepney==0.8.0
# via
# keyring
# secretstorage
keyring==25.4.1
# via twine
markdown==3.7
# via bugis (pyproject.toml)
markdown-it-py==3.0.0
# via rich
matplotlib-inline==0.1.7
# via ipython
mdurl==0.1.2
# via markdown-it-py
more-itertools==10.5.0
# via
# jaraco-classes
# jaraco-functools
mypy==1.13.0
# via bugis (pyproject.toml)
mypy-extensions==1.0.0
# via mypy
nh3==0.2.18
# via readme-renderer
packaging==24.1
# via build
parso==0.8.4
# via jedi
pexpect==4.9.0
# via ipython
pkginfo==1.10.0
# via twine
prompt-toolkit==3.0.48
# via ipython
ptyprocess==0.7.0
# via pexpect
pure-eval==0.2.3
# via stack-data
pwo==0.0.4
# via bugis (pyproject.toml)
pycparser==2.22
# via cffi
pygments==2.18.0
# via
# bugis (pyproject.toml)
# ipython
# readme-renderer
# rich
pygraphviz==1.14
# via bugis (pyproject.toml)
pyproject-hooks==1.2.0
# via build
pyyaml==6.0.2
# via bugis (pyproject.toml)
readme-renderer==44.0
# via twine
requests==2.32.3
# via
# requests-toolbelt
# twine
requests-toolbelt==1.0.0
# via twine
rfc3986==2.0.0
# via twine
rich==13.9.3
# via twine
secretstorage==3.3.3
# via keyring
six==1.16.0
# via asttokens
sniffio==1.3.1
# via
# anyio
# httpx
stack-data==0.6.3
# via ipython
traitlets==5.14.3
# via
# ipython
# matplotlib-inline
twine==5.1.1
# via bugis (pyproject.toml)
typing-extensions==4.12.2
# via
# mypy
# pwo
urllib3==2.2.3
# via
# requests
# twine
uvloop==0.21.0
# via granian
watchdog==5.0.3
# via bugis (pyproject.toml)
wcwidth==0.2.13
# via prompt-toolkit
zipp==3.20.2
# via importlib-metadata

requirements-run.txt (new file, 57 lines)

@@ -0,0 +1,57 @@
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --extra=run --output-file=requirements-run.txt pyproject.toml
#
--index-url https://gitea.woggioni.net/api/packages/woggioni/pypi/simple
--extra-index-url https://pypi.org/simple
aiofiles==24.1.0
# via bugis (pyproject.toml)
anyio==4.6.2.post1
# via httpx
certifi==2024.8.30
# via
# httpcore
# httpx
click==8.1.7
# via granian
granian==1.6.1
# via bugis (pyproject.toml)
h11==0.14.0
# via httpcore
h2==4.1.0
# via httpx
hpack==4.0.0
# via h2
httpcore==1.0.6
# via httpx
httpx[http2]==0.27.2
# via bugis (pyproject.toml)
hyperframe==6.0.1
# via h2
idna==3.10
# via
# anyio
# httpx
markdown==3.7
# via bugis (pyproject.toml)
pwo==0.0.4
# via bugis (pyproject.toml)
pygments==2.18.0
# via bugis (pyproject.toml)
pygraphviz==1.14
# via bugis (pyproject.toml)
pyyaml==6.0.2
# via bugis (pyproject.toml)
sniffio==1.3.1
# via
# anyio
# httpx
typing-extensions==4.12.2
# via pwo
uvloop==0.21.0
# via granian
watchdog==5.0.3
# via bugis (pyproject.toml)

requirements.txt (new file, 51 lines)

@@ -0,0 +1,51 @@
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --output-file=requirements.txt pyproject.toml
#
--index-url https://gitea.woggioni.net/api/packages/woggioni/pypi/simple
--extra-index-url https://pypi.org/simple
aiofiles==24.1.0
# via bugis (pyproject.toml)
anyio==4.6.2.post1
# via httpx
certifi==2024.8.30
# via
# httpcore
# httpx
h11==0.14.0
# via httpcore
h2==4.1.0
# via httpx
hpack==4.0.0
# via h2
httpcore==1.0.6
# via httpx
httpx[http2]==0.27.2
# via bugis (pyproject.toml)
hyperframe==6.0.1
# via h2
idna==3.10
# via
# anyio
# httpx
markdown==3.7
# via bugis (pyproject.toml)
pwo==0.0.4
# via bugis (pyproject.toml)
pygments==2.18.0
# via bugis (pyproject.toml)
pygraphviz==1.14
# via bugis (pyproject.toml)
pyyaml==6.0.2
# via bugis (pyproject.toml)
sniffio==1.3.1
# via
# anyio
# httpx
typing-extensions==4.12.2
# via pwo
watchdog==5.0.3
# via bugis (pyproject.toml)


@@ -1,43 +0,0 @@
from os.path import join, dirname
from setuptools import setup, find_packages
def read(fname):
return open(join(dirname(__file__), fname)).read()
config = {
'name': "md2html",
'version': "0.2",
'author': "Walter Oggioni",
'author_email': "oggioni.walter@gmail.com",
'description': ("Various development utility scripts"),
'long_description': '',
'license': "MIT",
'keywords': "build",
'url': "https://github.com/oggio88/md2html",
'packages': ['md2html'],
'package_data': {
'md2html': ['static/*.html', 'static/*.css', 'static/*.js'],
},
'include_package_data': True,
'classifiers': [
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
'License :: OSI Approved :: MIT License',
'Intended Audience :: System Administrators',
'Intended Audience :: Developers',
'Environment :: Console',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 3',
],
'install_requires': [
'markdown'
],
"entry_points": {
'console_scripts': [
'md2html=md2html.md2html:main',
],
}
}
setup(**config)

src/bugis/__main__.py (new file, 5 lines)

@@ -0,0 +1,5 @@
import sys

from .cli import main

main(sys.argv[1:])

src/bugis/asgi.py (new file, 67 lines)

@@ -0,0 +1,67 @@
import logging
from logging.config import dictConfig as configure_logging

from yaml import safe_load

from .configuration import Configuration

with open(Configuration.instance.logging_configuration_file, 'r') as input_file:
    conf = safe_load(input_file)
    configure_logging(conf)

from pwo import Maybe

from .server import Server
from asyncio import get_running_loop
from .asgi_utils import decode_headers
from typing import Optional, Awaitable, Callable, Any, Mapping

log = logging.getLogger('access')
log.propagate = False

_server: Optional[Server] = None


async def application(scope, receive, send: Callable[[Mapping[str, Any]], Awaitable[None]]):
    global _server
    if scope['type'] == 'lifespan':
        while True:
            message = await receive()
            if message['type'] == 'lifespan.startup':
                _server = Server(loop=get_running_loop(), prefix=None)
                await send({'type': 'lifespan.startup.complete'})
            elif message['type'] == 'lifespan.shutdown':
                await _server.stop()
                await send({'type': 'lifespan.shutdown.complete'})
    else:
        def maybe_log(evt):
            d = {
                'response_headers': (Maybe.of_nullable(evt.get('headers'))
                                     .map(decode_headers)
                                     .or_none()),
                'status': evt['status']
            }
            log.info(None, extra=dict(**{k: v for k, v in d.items() if k is not None}, **scope))

        def wrapped_send(*args, **kwargs):
            result = send(*args, **kwargs)
            (Maybe.of(args)
             .filter(lambda it: len(it) > 0)
             .map(lambda it: it[0])
             .filter(lambda it: it.get('type') == 'http.response.start')
             .if_present(maybe_log))
            return result

        pathsend = (Maybe.of_nullable(scope.get('extensions'))
                    .map(lambda it: it.get("http.response.pathsend"))
                    .is_present)
        await _server.handle_request(
            scope['method'],
            scope['path'],
            Maybe.of([header[1] for header in scope['headers'] if header[0].decode().lower() == 'if-none-match'])
            .filter(lambda it: len(it) > 0)
            .map(lambda it: it[0])
            .map(lambda it: it.decode())
            .or_else(None),
            Maybe.of_nullable(scope.get('query_string', None)).map(lambda it: it.decode()).or_else(None),
            wrapped_send,
            pathsend
        )
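
`bugis.asgi:application` is a plain ASGI callable, so it can also be served locally the same way the container entrypoint does it. A sketch assuming `bugis` and `granian` are installed in the active environment:

```bash
# mirror the container entrypoint: granian reads these settings from the environment
export GRANIAN_HOST=127.0.0.1 GRANIAN_PORT=8000 GRANIAN_INTERFACE=asgi GRANIAN_LOOP=asyncio
# run from the directory you want to serve
python -m granian bugis.asgi:application
```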

src/bugis/asgi_utils.py (new file, 26 lines)

@@ -0,0 +1,26 @@
from typing import Tuple, Dict, Sequence

type StrOrStrings = str | Sequence[str]


def decode_headers(headers: Sequence[Tuple[bytes, bytes]]) -> Dict[str, Sequence[str]]:
    result = dict()
    for key, value in headers:
        if isinstance(key, bytes):
            key = key.decode()
        if isinstance(value, bytes):
            value = value.decode()
        l = result.setdefault(key, list())
        l.append(value)
    return {
        k: tuple(v) for k, v in result.items()
    }


def encode_headers(headers: Dict[str, StrOrStrings]) -> Tuple[Tuple[bytes, bytes], ...]:
    result = []
    for key, value in headers.items():
        if isinstance(value, str):
            result.append((key.encode(), value.encode()))
        elif isinstance(value, Sequence):
            for single_value in value:
                result.append((key.encode(), single_value.encode()))
    return tuple(result)


@@ -0,0 +1,94 @@
import asyncio
from asyncio import Queue, AbstractEventLoop, Future, Task, gather
from logging import getLogger, Logger
from pathlib import Path

from pwo import TopicManager, Subscriber
from watchdog.events import FileMovedEvent, FileClosedEvent, FileCreatedEvent, FileModifiedEvent
from watchdog.events import FileSystemEventHandler, FileSystemEvent, PatternMatchingEventHandler
from watchdog.observers import Observer

log: Logger = getLogger(__name__)


class _EventHandler(FileSystemEventHandler):
    _queue: Queue
    _loop: AbstractEventLoop

    def __init__(self, queue: Queue, loop: AbstractEventLoop,
                 *args, **kwargs):
        self._loop = loop
        self._queue = queue
        super().__init__(*args, **kwargs)

    def on_created(self, event: FileSystemEvent) -> None:
        self._loop.call_soon_threadsafe(self._queue.put_nowait, event)

    def on_modified(self, event: FileSystemEvent) -> None:
        self._loop.call_soon_threadsafe(self._queue.put_nowait, event)


observer = Observer()


def watch(path: Path, queue: Queue, loop: AbstractEventLoop,
          recursive: bool = False) -> None:
    """Watch a directory for changes."""
    handler = _EventHandler(queue, loop)
    observer.schedule(handler, str(path), recursive=recursive)
    observer.start()
    observer.join()
    loop.call_soon_threadsafe(queue.put_nowait, None)


class FileWatcher(PatternMatchingEventHandler):
    _topic_manager: TopicManager
    _loop: AbstractEventLoop
    _topic_manager_loop: Task
    _running_tasks: Future

    def __init__(self, path):
        super().__init__(patterns=['*.md'],
                         ignore_patterns=None,
                         ignore_directories=False,
                         case_sensitive=True)
        self._observer: Observer = Observer()
        self._observer.schedule(self, path=path, recursive=True)
        self._loop = asyncio.get_running_loop()
        self._topic_manager = TopicManager(self._loop)
        self._running_tasks = gather(
            self._loop.run_in_executor(None, self._observer.start),
            self._loop.create_task(self._topic_manager.process_events())
        )

    async def stop(self) -> None:
        def _observer_stop():
            self._observer.stop()
            self._observer.join()
            self._topic_manager.post_event(None)

        await self._loop.run_in_executor(None, _observer_stop)
        await self._running_tasks

    def subscribe(self, path: str) -> Subscriber:
        return self._topic_manager.subscribe(path)

    def on_any_event(self, event: FileSystemEvent) -> None:
        what = "directory" if event.is_directory else "file"

        def post_event(path):
            self._topic_manager.post_event(path)

        if isinstance(event, FileClosedEvent):
            log.debug("Closed %s: %s", what, event.src_path)
            # update_subscriptions()
        elif isinstance(event, FileMovedEvent):
            log.debug("Moved %s: %s to %s", what, event.src_path, event.dest_path)
            post_event(event.dest_path)
        elif isinstance(event, FileCreatedEvent):
            log.debug("Created %s: %s", what, event.src_path)
            post_event(event.src_path)
        elif isinstance(event, FileModifiedEvent):
            log.debug("Modified %s: %s", what, event.src_path)
            post_event(event.src_path)

src/bugis/cli.py (new file, 41 lines)

@@ -0,0 +1,41 @@
import argparse
from dataclasses import asdict
from os import environ
from pathlib import Path
from typing import Optional, Sequence

import yaml
from granian import Granian
from pwo import Maybe

from .configuration import Configuration


def main(args: Optional[Sequence[str]] = None):
    parser = argparse.ArgumentParser(description="A simple CLI program to render Markdown files")
    default_configuration_file = (Maybe.of(environ.get('XDG_CONFIG_HOME'))
                                  .map(lambda it: Path(it))
                                  .map(lambda it: it / 'bugis' / 'bugis.yaml')
                                  .or_else_get(lambda: Path(environ.get('HOME')) / '.config' / 'bugis' / 'bugis.yaml')
                                  .filter(Path.exists)
                                  .or_else(None)
                                  )
    parser.add_argument(
        '-c',
        '--configuration',
        help='Path to the configuration file',
        default=default_configuration_file,
        type=Path,
    )
    args = parser.parse_args(args)

    def parse(configuration: Path):
        with open(configuration, 'r') as f:
            return Configuration.from_dict(yaml.safe_load(f))

    conf = Maybe.of_nullable(args.configuration).map(parse).or_else(Configuration.instance)
    Granian(
        "bugis.asgi:application",
        **asdict(conf.granian)
    ).serve()

src/bugis/configuration.py (new file, 140 lines)

@@ -0,0 +1,140 @@
from os import environ
from pathlib import Path
from dataclasses import dataclass, field, asdict

from granian.constants import Loops, Interfaces, ThreadModes, HTTPModes, StrEnum
from granian.log import LogLevels
from granian.http import HTTP1Settings, HTTP2Settings
from typing import Optional, Sequence, Dict, Any
from pwo import classproperty, Maybe
from yaml import add_representer, SafeDumper, SafeLoader


@dataclass(frozen=True)
class Configuration:
    logging_configuration_file: str = environ.get("LOGGING_CONFIGURATION_FILE",
                                                  Path(__file__).parent / 'default-conf' / 'logging.yaml')
    plant_uml_server_address: str = environ.get('PLANT_UML_SERVER_ADDRESS', None)

    @classproperty
    def instance(cls) -> 'Configuration':
        return Configuration()

    @dataclass(frozen=True)
    class GranianConfiguration:
        address: str = '127.0.0.1'
        port: int = 8000
        interface: str = Interfaces.ASGI
        workers: int = 1
        threads: int = 1
        blocking_threads: Optional[int] = None
        threading_mode: ThreadModes = ThreadModes.workers
        loop: Loops = Loops.auto
        loop_opt: bool = False
        http: HTTPModes = HTTPModes.auto
        websockets: bool = True
        backlog: int = 1024
        backpressure: Optional[int] = None
        http1_settings: Optional[HTTP1Settings] = None
        http2_settings: Optional[HTTP2Settings] = None
        log_enabled: bool = True
        log_level: LogLevels = LogLevels.info
        log_dictconfig: Optional[Dict[str, Any]] = None
        log_access: bool = False
        log_access_format: Optional[str] = None
        ssl_cert: Optional[Path] = None
        ssl_key: Optional[Path] = None
        ssl_key_password: Optional[str] = None
        url_path_prefix: Optional[str] = None
        respawn_failed_workers: bool = False
        respawn_interval: float = 3.5
        workers_lifetime: Optional[int] = None
        factory: bool = False
        reload: bool = False
        reload_paths: Optional[Sequence[Path]] = None
        reload_ignore_dirs: Optional[Sequence[str]] = None
        reload_ignore_patterns: Optional[Sequence[str]] = None
        reload_ignore_paths: Optional[Sequence[Path]] = None
        process_name: Optional[str] = None
        pid_file: Optional[Path] = None

        @staticmethod
        def from_dict(d) -> 'Configuration.GranianConfiguration':
            return Configuration.GranianConfiguration(**{k: v for k, v in dict(
                address=d.get('address', None),
                port=d.get('port', None),
                interface=Maybe.of_nullable(d.get('interface')).map(lambda it: Interfaces(it)).or_else(None),
                workers=d.get('workers', None),
                threads=d.get('threads', None),
                blocking_threads=d.get('blocking_threads', None),
                threading_mode=Maybe.of_nullable(d.get('threading_modes')).map(lambda it: ThreadModes(it)).or_else(None),
                loop=Maybe.of_nullable(d.get('loop')).map(lambda it: Loops(it)).or_else(None),
                loop_opt=d.get('loop_opt', None),
                http=Maybe.of_nullable(d.get('http')).map(lambda it: HTTPModes(it)).or_else(None),
                websockets=d.get('websockets', None),
                backlog=d.get('backlog', None),
                backpressure=d.get('backpressure', None),
                http1_settings=Maybe.of_nullable(d.get('http1_settings')).map(lambda it: HTTP1Settings(**it)).or_else(None),
                http2_settings=Maybe.of_nullable(d.get('http2_settings')).map(lambda it: HTTP2Settings(**it)).or_else(None),
                log_enabled=d.get('log_enabled', None),
                log_level=Maybe.of_nullable(d.get('log_level')).map(lambda it: LogLevels(it)).or_else(None),
                # log_dictconfig: Optional[Dict[str, Any]] = None,
                log_access=d.get('log_access', None),
                log_access_format=d.get('log_access_format', None),
                ssl_cert=d.get('ssl_cert', None),
                ssl_key=d.get('ssl_key', None),
                ssl_key_password=d.get('ssl_key_password', None),
                url_path_prefix=d.get('url_path_prefix', None),
                respawn_failed_workers=d.get('respawn_failed_workers', None),
                respawn_interval=d.get('respawn_interval', None),
                workers_lifetime=d.get('workers_lifetime', None),
                factory=d.get('factory', None),
                reload=d.get('reload', None),
                reload_paths=d.get('reload_paths', None),
                reload_ignore_dirs=d.get('reload_ignore_dirs', None),
                reload_ignore_patterns=d.get('reload_ignore_patterns', None),
                reload_ignore_paths=d.get('reload_ignore_paths', None),
                process_name=d.get('process_name', None),
                pid_file=d.get('pid_file', None),
            ).items() if v is not None})

    granian: GranianConfiguration = GranianConfiguration()

    @staticmethod
    def from_dict(d) -> 'Configuration':
        return Configuration(
            **{k: v for k, v in dict(
                logging_configuration_file=d.get('logging_configuration_file', None),
                plant_uml_server_address=d.get('plant_uml_server_address', None),
                granian=Maybe.of_nullable(d.get('granian'))
                .map(Configuration.GranianConfiguration.from_dict)
                .or_else(None)
            ).items() if v is not None
               }
        )

    def to_yaml(self, stream):
        dumper = SafeDumper(stream)
        dumper.add_representer(Configuration, lambda dumper, conf: dumper.represent_dict(asdict(conf)))
        dumper.add_representer(Configuration.GranianConfiguration,
                               lambda dumper, conf: dumper.represent_dict(asdict(conf)))
        dumper.add_representer(LogLevels, lambda dumper, level: dumper.represent_str(level.lower()))
        dumper.add_multi_representer(Path, lambda dumper, path: dumper.represent_str(str(path)))
        dumper.add_multi_representer(StrEnum, lambda dumper, str_enum: dumper.represent_str(str(str_enum)))
        dumper.add_representer(HTTP1Settings, lambda dumper, settings: dumper.represent_dict(vars(settings)))
        dumper.add_representer(HTTP2Settings, lambda dumper, settings: dumper.represent_dict(vars(settings)))
        try:
            dumper.open()
            dumper.represent(Configuration.instance)
            dumper.close()
        finally:
            dumper.dispose()

    @staticmethod
    def from_yaml(stream) -> 'Configuration':
        loader = SafeLoader(stream)
        try:
            conf = loader.get_single_data()
            return Configuration.from_dict(conf)
        finally:
            loader.dispose()
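
Putting the two `from_dict` readers together, the configuration file is a YAML document with the top-level keys of `Configuration` plus an optional `granian` section whose keys mirror `GranianConfiguration`. A minimal sketch with illustrative values:

```yaml
# ~/.config/bugis/bugis.yaml (picked up automatically by the CLI when present)
plant_uml_server_address: http://localhost:8080
granian:
  address: 127.0.0.1
  port: 8000
  workers: 1
  log_level: info
```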


@@ -0,0 +1,35 @@
version: 1
disable_existing_loggers: True
handlers:
  console:
    class: logging.StreamHandler
    formatter: default
    level: INFO
    stream: ext://sys.stderr
  access:
    class: logging.StreamHandler
    formatter: request
    level: INFO
    stream: ext://sys.stderr
formatters:
  brief:
    format: '%(message)s'
  default:
    format: '{asctime} [{levelname}] ({processName:s}/{threadName:s}) - {name} - {message}'
    style: '{'
    datefmt: '%Y-%m-%d %H:%M:%S'
  request:
    format: '{asctime} {client[0]}:{client[1]} HTTP/{http_version} {method} {path} - {status}'
    style: '{'
    datefmt: '%Y-%m-%d %H:%M:%S'
loggers:
  root:
    handlers: [console]
    level: DEBUG
  access:
    handlers: [access]
    level: INFO
  watchdog.observers.inotify_buffer:
    level: INFO
  MARKDOWN:
    level: INFO

src/bugis/md2html.py (new file, 61 lines)

@@ -0,0 +1,61 @@
import sys
from os.path import dirname, join, relpath
from time import time
from typing import Optional, TYPE_CHECKING

from aiofiles import open as async_open
from asyncio import AbstractEventLoop
import markdown
import logging

if TYPE_CHECKING:
    from _typeshed import StrOrBytesPath

STATIC_RESOURCES: set[str] = {
    '/github-markdown.css',
    '/custom.css',
    '/hot-reload.js',
    '/pygment.css',
    '/markdown.svg'
}

STATIC_CACHE: dict[str, tuple[str, float]] = {}

MARDOWN_EXTENSIONS = ['extra', 'smarty', 'tables', 'codehilite']

logger = logging.getLogger(__name__)


async def load_from_cache(path) -> tuple[str, float]:
    global STATIC_CACHE
    if path not in STATIC_CACHE:
        async with async_open(join(dirname(__file__), 'static') + path, 'r') as static_file:
            STATIC_CACHE[path] = (await static_file.read(), time())
    return STATIC_CACHE[path]


async def compile_html(url_path,
                       mdfile: 'StrOrBytesPath',
                       loop: AbstractEventLoop,
                       prefix: Optional['StrOrBytesPath'] = None,
                       extensions: Optional[list[str]] = None,
                       raw: bool = False) -> str:
    async with mdfile and async_open(mdfile, 'r') or sys.stdin as instream:
        src = await instream.read()

    def render(source) -> str:
        logger.debug("Starting markdown rendering for file '%s'", mdfile)
        result = markdown.markdown(source, extensions=extensions, output_format='html')
        logger.debug("Markdown rendering for file '%s' completed", mdfile)
        return result

    html = await loop.run_in_executor(None, render, src)
    if raw:
        doc = html
    else:
        parent = dirname(url_path)
        prefix = prefix or relpath('/', start=parent)
        script = f'<script src="{prefix}/hot-reload.js", type="text/javascript" defer="true"></script>'
        css = f'<link rel="icon" type="image/x-icon" href="{prefix}/markdown.svg">'
        for css_file in ('github-markdown.css', 'pygment.css', 'custom.css'):
            css += f' <link rel="stylesheet" href="{prefix}/{css_file}">'
        doc = (await load_from_cache('/template.html'))[0].format(content=html, script=script, css=css)
    return doc

src/bugis/plantuml.py (new file, 19 lines)

@@ -0,0 +1,19 @@
from typing import TYPE_CHECKING

from aiofiles import open as async_open

from .configuration import Configuration

if TYPE_CHECKING:
    from _typeshed import StrOrBytesPath

from httpx import AsyncClient, URL
from typing import Callable, Awaitable
from urllib.parse import urljoin

chunk_size = 0x10000


async def render_plant_uml(client: AsyncClient, path: 'StrOrBytesPath', send: Callable[[bytes], Awaitable[None]]):
    url = URL(urljoin(Configuration.instance.plant_uml_server_address, 'svg'))
    async with async_open(path, 'rb') as file:
        source = await file.read()
    response = await client.post(url, content=source)
    response.raise_for_status()
    async for chunk in response.aiter_bytes(chunk_size=chunk_size):
        await send(chunk)

src/bugis/server.py (new file, 407 lines)

@@ -0,0 +1,407 @@
import hashlib
import logging
from asyncio import AbstractEventLoop
from asyncio import Future
from io import BytesIO
from mimetypes import init as mimeinit, guess_type
from os import getcwd
from os.path import join, normpath, splitext, relpath, basename
from typing import Callable, TYPE_CHECKING, Optional, Awaitable, AsyncGenerator, Any, Mapping

import pygraphviz as pgv
from aiofiles import open as async_open
from aiofiles.base import AiofilesContextManager
from aiofiles.os import listdir
from aiofiles.ospath import exists, isdir, isfile, getmtime
from aiofiles.threadpool.binary import AsyncBufferedReader
from httpx import AsyncClient
from pwo import Maybe

from .asgi_utils import encode_headers
from .async_watchdog import FileWatcher
from .md2html import compile_html, load_from_cache, STATIC_RESOURCES, MARDOWN_EXTENSIONS
from .plantuml import render_plant_uml

if TYPE_CHECKING:
    from _typeshed import StrOrBytesPath

mimeinit()

cwd: 'StrOrBytesPath' = getcwd()


def completed_future[T](result: T) -> Future[T]:
    future = Future()
    future.set_result(result)
    return future


def has_extension(filepath, extension):
    _, ext = splitext(filepath)
    return ext == extension


def is_markdown(filepath):
    return has_extension(filepath, ".md")


def is_dotfile(filepath):
    return has_extension(filepath, ".dot")


def is_plant_uml(filepath):
    return has_extension(filepath, ".puml")


logger = logging.getLogger(__name__)


class Server:
    root_dir: 'StrOrBytesPath'
    prefix: Optional['StrOrBytesPath']
    _loop: AbstractEventLoop
    _client: AsyncClient

    def __init__(self,
                 root_dir: 'StrOrBytesPath' = getcwd(),
                 prefix: Optional['StrOrBytesPath'] = None,
                 loop: AbstractEventLoop = None):
        self.root_dir = root_dir
        self.cache = dict['StrOrBytesPath', tuple[str, float]]()
        self.file_watcher = FileWatcher(cwd)
        self.prefix = prefix and normpath(f'{prefix.decode()}')
        self._loop = loop
        self._client = AsyncClient()

    async def handle_request(self,
                             method: str,
                             url_path: str,
                             etag: Optional[str],
                             query_string: Optional[str],
                             send: Callable[[Mapping[str, Any]], Awaitable[None]],
                             pathsend: bool = False
                             ):
        if method != 'GET':
            await send({
                'type': 'http.response.start',
                'status': 405
            })
            await send({
                'type': 'http.response.body',
                'body': b'',
            })
            return
        relative_path = relpath(url_path, start=self.prefix or '/')
        url_path: 'StrOrBytesPath' = normpath(join('/', relative_path))
        path: 'StrOrBytesPath' = join(self.root_dir, relative_path)
        if url_path in STATIC_RESOURCES:
            content, mtime = await load_from_cache(url_path)
            content = content.encode()
            etag, digest = await self.compute_etag_and_digest(
                etag,
                url_path,
                lambda: AiofilesContextManager(
                    completed_future(AsyncBufferedReader(BytesIO(content), loop=self._loop, executor=None))),
                lambda: completed_future(mtime)
            )
            if etag and etag == digest:
                await self.not_modified(send, digest, 'must-revalidate, max-age=86400')
                return
            elif content:
                mime_type = guess_type(basename(url_path))[0] or 'application/octet-stream'
                await send({
                    'type': 'http.response.start',
                    'status': 200,
                    'headers': encode_headers({
                        'content-type': f'{mime_type}; charset=UTF-8',
                        'etag': f'W/{digest}',
                        'Cache-Control': 'must-revalidate, max-age=86400',
                    })
                })
                await send({
                    'type': 'http.response.body',
                    'body': content
                })
                return
        elif await exists(path):
            if await isfile(path):
                etag, digest = await self.compute_etag_and_digest(
                    etag,
                    path,
                    lambda: async_open(path, 'rb'),
                    lambda: getmtime(path)
                )
                logger.debug('Etag: %s, digest: %s', etag, digest)
                if etag and etag == digest:
                    if is_markdown(path) and query_string == 'reload':
                        subscription = self.file_watcher.subscribe(path)
                        try:
                            has_changed = await subscription.wait(30)
                            if has_changed:
                                _, digest = await self.compute_etag_and_digest(
                                    etag,
                                    path,
                                    lambda: async_open(path, 'rb'),
                                    lambda: getmtime(path)
                                )
                                if etag != digest:
                                    if await exists(path) and await isfile(path):
                                        await self.render_markdown(url_path, path, True, digest, send)
                                        return
                                    else:
                                        await self.not_found(send)
                                        return
                        finally:
                            subscription.unsubscribe()
                    await self.not_modified(send, digest)
                elif is_markdown(path):
                    raw = query_string == 'reload'
                    await self.render_markdown(url_path, path, raw, digest, send)
                elif is_dotfile(path):
                    def render_graphviz(filepath: StrOrBytesPath) -> bytes:
                        logger.debug("Starting Graphviz rendering for file '%s'", filepath)
                        graph = pgv.AGraph(filepath)
                        result = graph.draw(None, format="svg", prog="dot")
                        logger.debug("Completed Graphviz rendering for file '%s'", filepath)
                        return result

                    body = await self._loop.run_in_executor(None, render_graphviz, path)
                    await send({
                        'type': 'http.response.start',
                        'status': 200,
                        'headers': encode_headers({
                            'Content-Type': 'image/svg+xml; charset=UTF-8',
                            'Etag': f'W/{digest}',
                            'Cache-Control': 'no-cache',
                        })
                    })
                    await send({
                        'type': 'http.response.body',
                        'body': body
                    })
                elif is_plant_uml(path):
                    logger.debug("Starting PlantUML rendering for file '%s'", path)
                    logger.debug("Completed PlantUML rendering for file '%s'", path)
                    await send({
                        'type': 'http.response.start',
                        'status': 200,
                        'headers': encode_headers({
                            'Content-Type': 'image/svg+xml; charset=UTF-8',
                            'Etag': f'W/{digest}',
                            'Cache-Control': 'no-cache'
                        })
                    })
                    await render_plant_uml(self._client, path, lambda chunk: send({
                        'type': 'http.response.body',
                        'body': chunk,
                        'more_body': True
                    }))
                    await send({
                        'type': 'http.response.body',
                        'body': '',
                        'more_body': False
                    })
                else:
                    async def read_file(file_path, buffer_size=0x10000):
                        async with async_open(file_path, 'rb') as f:
                            while True:
                                result = await f.read(buffer_size)
                                if len(result) == 0:
                                    break
                                yield result

                    await send({
                        'type': 'http.response.start',
                        'status': 200,
                        'headers': encode_headers({
                            'Content-Type': guess_type(basename(path))[0] or 'application/octet-stream',
                            'Etag': f'W/{digest}',
                            'Cache-Control': 'no-cache'
                        })
                    })
                    if pathsend:
                        await send({
                            'type': 'http.response.pathsend',
                            'path': path
                        })
                    else:
                        async for chunk in read_file(path):
                            await send({
                                'type': 'http.response.body',
                                'body': chunk,
                                'more_body': True
                            })
                        await send({
                            'type': 'http.response.body',
                            'body': b'',
                            'more_body': False
                        })
            elif await isdir(path):
                body = (await self.directory_listing(url_path, path)).encode()
                await send({
                    'type': 'http.response.start',
                    'status': 200,
                    'headers': encode_headers({
                        'Content-Type': 'text/html; charset=UTF-8',
                    })
                })
                await send({
                    'type': 'http.response.body',
                    'body': body
                })
        else:
            await self.not_found(send)

    @staticmethod
    async def stream_hash(source: AsyncBufferedReader, bufsize=0x10000) -> bytes:
        if bufsize <= 0:
            raise ValueError("Buffer size must be greater than 0")
        md5 = hashlib.md5()
        while True:
            buf = await source.read(bufsize)
            if len(buf) == 0:
                break
            md5.update(buf)
        return md5.digest()

    @staticmethod
    async def file_hash(filepath, bufsize=0x10000) -> bytes:
        if bufsize <= 0:
            raise ValueError("Buffer size must be greater than 0")
        md5 = hashlib.md5()
        async with async_open(filepath, 'rb') as f:
            while True:
                buf = await f.read(bufsize)
                if len(buf) == 0:
                    break
                md5.update(buf)
        return md5.digest()

    @staticmethod
    def parse_etag(etag: str) -> Optional[str]:
        def skip_weak_marker(s):
            if s.startswith('W/'):
                return s[2:]
            else:
                return s

        return (
            Maybe.of_nullable(etag)
            .map(skip_weak_marker)
            .or_else(None)
        )

    async def compute_etag_and_digest(
            self,
            etag_header: str,
            path: str,
            stream_source: Callable[[], AiofilesContextManager[AsyncBufferedReader]],
            mtime_supplier: Callable[[], Awaitable[float]]
    ) -> tuple[str, str]:
        cache_result = self.cache.get(path)
        _mtime: Optional[float] = None

        async def mtime() -> float:
            nonlocal _mtime
            if not _mtime:
                _mtime = await mtime_supplier()
            return _mtime

        if not cache_result or cache_result[1] < await mtime():
            async with stream_source() as stream:
                digest = (await Server.stream_hash(stream)).hex()
            self.cache[path] = digest, await mtime()
        else:
            digest = cache_result[0]
        etag = Server.parse_etag(etag_header)
        return etag, digest

    async def render_markdown(self,
                              url_path: 'StrOrBytesPath',
                              path: str,
                              raw: bool,
                              digest: str,
                              send) -> None:
        body = await compile_html(
            url_path,
            path,
            self._loop,
            self.prefix,
            MARDOWN_EXTENSIONS,
            raw=raw
        )
        await send({
            'type': 'http.response.start',
            'status': 200,
            'headers': encode_headers({
                'Content-Type': 'text/html; charset=UTF-8',
                'Etag': f'W/{digest}',
                'Cache-Control': 'no-cache',
            })
        })
        await send({
            'type': 'http.response.body',
            'body': body.encode()
        })

    @staticmethod
    async def not_modified(send, digest: str, cache_control: str = 'no-cache') -> []:
        await send({
            'type': 'http.response.start',
            'status': 304,
            'headers': encode_headers({
                'Etag': f'W/{digest}',
                'Cache-Control': cache_control
            })
        })
        await send({
            'type': 'http.response.body',
        })
        return

    @staticmethod
    async def not_found(send) -> None:
        await send({
            'type': 'http.response.start',
            'status': 404
        })
        await send({
            'type': 'http.response.body',
        })

    async def directory_listing(self, path_info, path) -> str:
        icon_path = join(self.prefix or '', 'markdown.svg')
        title = "Directory listing for %s" % path_info
        result = "<!DOCTYPE html><html><head>"
        result += f'<link rel="icon" type="image/x-icon" href="{icon_path}">'
        result += "<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">"
        result += "<title>" + title + "</title></head>"
        result += "<body><h1>" + title + "</h1><hr>"
        result += "<ul>"
        if path_info != '/':
            result += "<li><a href=\"../\"/>../</li>"

        async def ls(filter: Callable[[str], Awaitable[bool]]) -> AsyncGenerator[str, Any]:
            async def result():
                for entry in sorted(await listdir(path)):
                    if await filter(join(path, entry)):
                        yield entry

            return result()

        async for entry in await ls(isdir):
            result += '<li><a href="' + entry + '/' + '"/>' + entry + '/' + '</li>'

        async def file_filter(entry: str) -> bool:
            return await isfile(entry) and is_markdown(entry)

        async for entry in await ls(file_filter):
            result += '<li><a href="' + entry + '"/>' + entry + '</li>'
        return result

    async def stop(self):
        await self.file_watcher.stop()
        await self._client.aclose()
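
The ETag/`?reload` handshake implemented by `handle_request` can be exercised by hand. A sketch assuming the server is listening on port 8000; the file path is illustrative:

```bash
# first request: note the weak ETag (W/<md5-hex>) in the response headers
curl -i http://localhost:8000/notes/todo.md
# long-poll for changes: blocks for up to ~30 s waiting for a modification,
# answers 304 Not Modified if the content digest still matches the ETag
curl -i -H 'If-None-Match: W/<etag-from-previous-response>' \
    'http://localhost:8000/notes/todo.md?reload'
```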


@@ -0,0 +1,21 @@
function req(first, previousETag) {
    const minInterval = 2000;
    const start = new Date().getTime();
    const xmlhttp = new XMLHttpRequest();
    xmlhttp.onload = function() {
        const eTag = xmlhttp.getResponseHeader("Etag")
        if (xmlhttp.status == 200 && eTag !== previousETag) {
            document.querySelector("article.markdown-body").innerHTML = xmlhttp.responseText;
        }
        const nextCall = Math.min(minInterval, Math.max(0, minInterval - (new Date().getTime() - start)));
        setTimeout(req, nextCall, false, eTag);
    };
    xmlhttp.onerror = function() {
        console.log(xmlhttp.status, xmlhttp.statusText);
        setTimeout(req, minInterval, false, previousETag);
    };
    xmlhttp.open("GET", location.pathname + "?reload", true);
    xmlhttp.send();
}
req(true, null);


@@ -0,0 +1 @@
<svg viewBox="0 0 512 512" xmlns="http://www.w3.org/2000/svg"><rect fill="#fff" height="512" rx="15%" width="512"/><path d="m410 366h-308c-14 0-26-12-26-26v-170c0-14 12-26 26-26h307c14 0 26 12 26 26v170c0 14-11 26-25 26zm-308-204c-4 0-9 4-9 9v170c0 5 4 9 9 9h307c5 0 9-4 9-9v-171c0-5-4-9-9-9h-307zm26 153v-119h34l34 43 34-43h35v118h-34v-68l-34 43-34-43v68zm216 0-52-57h34v-61h34v60h34z"/></svg>



@@ -0,0 +1,75 @@
pre { line-height: 125%; }
td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; }
span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; }
td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; }
span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; }
.codehilite .hll { background-color: #ffffcc }
.codehilite { background: #f8f8f8; }
.codehilite .c { color: #3D7B7B; font-style: italic } /* Comment */
.codehilite .err { border: 1px solid #FF0000 } /* Error */
.codehilite .k { color: #008000; font-weight: bold } /* Keyword */
.codehilite .o { color: #666666 } /* Operator */
.codehilite .ch { color: #3D7B7B; font-style: italic } /* Comment.Hashbang */
.codehilite .cm { color: #3D7B7B; font-style: italic } /* Comment.Multiline */
.codehilite .cp { color: #9C6500 } /* Comment.Preproc */
.codehilite .cpf { color: #3D7B7B; font-style: italic } /* Comment.PreprocFile */
.codehilite .c1 { color: #3D7B7B; font-style: italic } /* Comment.Single */
.codehilite .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */
.codehilite .gd { color: #A00000 } /* Generic.Deleted */
.codehilite .ge { font-style: italic } /* Generic.Emph */
.codehilite .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */
.codehilite .gr { color: #E40000 } /* Generic.Error */
.codehilite .gh { color: #000080; font-weight: bold } /* Generic.Heading */
.codehilite .gi { color: #008400 } /* Generic.Inserted */
.codehilite .go { color: #717171 } /* Generic.Output */
.codehilite .gp { color: #000080; font-weight: bold } /* Generic.Prompt */
.codehilite .gs { font-weight: bold } /* Generic.Strong */
.codehilite .gu { color: #800080; font-weight: bold } /* Generic.Subheading */
.codehilite .gt { color: #0044DD } /* Generic.Traceback */
.codehilite .kc { color: #008000; font-weight: bold } /* Keyword.Constant */
.codehilite .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */
.codehilite .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */
.codehilite .kp { color: #008000 } /* Keyword.Pseudo */
.codehilite .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */
.codehilite .kt { color: #B00040 } /* Keyword.Type */
.codehilite .m { color: #666666 } /* Literal.Number */
.codehilite .s { color: #BA2121 } /* Literal.String */
.codehilite .na { color: #687822 } /* Name.Attribute */
.codehilite .nb { color: #008000 } /* Name.Builtin */
.codehilite .nc { color: #0000FF; font-weight: bold } /* Name.Class */
.codehilite .no { color: #880000 } /* Name.Constant */
.codehilite .nd { color: #AA22FF } /* Name.Decorator */
.codehilite .ni { color: #717171; font-weight: bold } /* Name.Entity */
.codehilite .ne { color: #CB3F38; font-weight: bold } /* Name.Exception */
.codehilite .nf { color: #0000FF } /* Name.Function */
.codehilite .nl { color: #767600 } /* Name.Label */
.codehilite .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */
.codehilite .nt { color: #008000; font-weight: bold } /* Name.Tag */
.codehilite .nv { color: #19177C } /* Name.Variable */
.codehilite .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */
.codehilite .w { color: #bbbbbb } /* Text.Whitespace */
.codehilite .mb { color: #666666 } /* Literal.Number.Bin */
.codehilite .mf { color: #666666 } /* Literal.Number.Float */
.codehilite .mh { color: #666666 } /* Literal.Number.Hex */
.codehilite .mi { color: #666666 } /* Literal.Number.Integer */
.codehilite .mo { color: #666666 } /* Literal.Number.Oct */
.codehilite .sa { color: #BA2121 } /* Literal.String.Affix */
.codehilite .sb { color: #BA2121 } /* Literal.String.Backtick */
.codehilite .sc { color: #BA2121 } /* Literal.String.Char */
.codehilite .dl { color: #BA2121 } /* Literal.String.Delimiter */
.codehilite .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */
.codehilite .s2 { color: #BA2121 } /* Literal.String.Double */
.codehilite .se { color: #AA5D1F; font-weight: bold } /* Literal.String.Escape */
.codehilite .sh { color: #BA2121 } /* Literal.String.Heredoc */
.codehilite .si { color: #A45A77; font-weight: bold } /* Literal.String.Interpol */
.codehilite .sx { color: #008000 } /* Literal.String.Other */
.codehilite .sr { color: #A45A77 } /* Literal.String.Regex */
.codehilite .s1 { color: #BA2121 } /* Literal.String.Single */
.codehilite .ss { color: #19177C } /* Literal.String.Symbol */
.codehilite .bp { color: #008000 } /* Name.Builtin.Pseudo */
.codehilite .fm { color: #0000FF } /* Name.Function.Magic */
.codehilite .vc { color: #19177C } /* Name.Variable.Class */
.codehilite .vg { color: #19177C } /* Name.Variable.Global */
.codehilite .vi { color: #19177C } /* Name.Variable.Instance */
.codehilite .vm { color: #19177C } /* Name.Variable.Magic */
.codehilite .il { color: #666666 } /* Literal.Number.Integer.Long */


@@ -3,9 +3,9 @@
<head>
<meta name="viewport" content="width=device-width, initial-scale=1">
{css}
{script}
</head>
<body>
{script}
<article class="markdown-body">
{content}
</article>

test/README.md (new file, 115 lines)

@@ -0,0 +1,115 @@
## Overview
Envelope is a simple Gradle plugin that allows you to create an executable jar file
that includes all runtime dependencies and can be executed with a simple
```bash
java -jar my-app.jar
```
It supports JPMS, embedded system properties, Java agents, and extra folders added to the classpath.
### Usage
Declare the plugin in your build's `settings.gradle` like this
```groovy
pluginManagement {
repositories {
maven {
url = 'https://woggioni.net/mvn/'
}
}
plugins {
id "net.woggioni.gradle.envelope" version "2023.09.25"
}
}
```
Then add it to a project's `build.gradle`
```groovy
plugins {
id 'net.woggioni.gradle.envelope'
}
envelopeJar {
mainClass = 'your.main.Class'
}
```
The plugin adds 2 tasks to your project:
- `envelopeJar` of type `net.woggioni.gradle.envelope.EnvelopeJarTask` that creates the executable jar in the project's libraries folder
- `envelopeRun` of type `org.gradle.api.tasks.JavaExec` which launches the jar created by the `envelopeJar` task
### Configuration
`EnvelopeJarTask` has several properties useful for configuration purposes:
###### mainClass
This string property sets the class that will be searched for the `main` method to start the application
###### mainModule
When this string property is set, the jar file will be started in JPMS mode (if running on Java 9+) and
this module will be searched for the main class; if `mainClass` is not set, the main class specified
in the module descriptor will be loaded instead.
###### systemProperties
This is a map that contains Java system properties that will be set before your application starts
###### extraClasspath
This is a list of strings representing filesystem paths that will be added to the classpath (if running in classpath mode)
or to the module path (if running in JPMS mode) when the application starts.
Relative paths and interpolation with Java system properties and environment variables are supported.
For example, this looks for a `plugins` folder in the user's home directory:
```
${env:HOME}/plugins
```
Same using Java system properties instead
```
${sys:user.home}/plugins
```
###### javaAgent
This is a method accepting two strings: the first is the Java agent class name and the second is the Java agent arguments.
It can be invoked multiple times to set up multiple Java agents for the same JAR file.
All the Java agents will be invoked before the application starts.
### Example
```groovy
plugins {
id 'net.woggioni.gradle.envelope'
}
envelopeJar {
mainClass = 'your.main.Class'
mainModule = 'your.main.module'
systemProperties = [
'some.property' : 'Some value'
]
extraClasspath = ["plugins"]
javaAgent('your.java.agent.Class', 'optional agent arguments')
}
```
### Limitations
This plugin requires Gradle >= 6.0 and Java >= 8 to build the executable jar file.
The assembled envelope jar requires Java >= 8 to run if only `mainClass` is specified;
if both `mainModule` and `mainClass` are specified, the generated jar file will (try to) run in classpath mode on Java 8
and in JPMS mode on Java > 8.
<object data="example.dot"/>

test/example.dot (new file, 27 lines)

@@ -0,0 +1,27 @@
digraph D {
subgraph cluster_p {
label = "Parent";
subgraph cluster_c1 {
label = "Child one";
a;
subgraph cluster_gc_1 {
label = "Grand-Child one";
b;
}
subgraph cluster_gc_2 {
label = "Grand-Child two";
c;
d;
}
}
subgraph cluster_c2 {
label = "Child two";
e;
}
}
}