update to nextra 4

2025-09-06 19:19:45 +02:00
parent d17a565130
commit 7864c38371
48 changed files with 998 additions and 500 deletions

content/_meta.ts Normal file

@@ -0,0 +1,8 @@
export default {
  index: 'Intro',
  cli: 'CLI',
  git: 'Git',
  dev_ops: 'Dev Ops',
  latex: 'LaTeX',
  web_dev: 'Web Development',
}

content/admin/windows.md Normal file

@@ -0,0 +1,21 @@
---
tags:
- windows
- administration
---
# Windows
## List of tools
- [De-Bloat](https://github.com/Raphire/Win11Debloat)
- [Microsoft Activation Scripts](https://github.com/massgravel/Microsoft-Activation-Scripts)
- [Uniget UI](https://github.com/marticliment/UnigetUI)
## Utils
### Add local user
Press `Win + R`, then run `netplwiz`.


@@ -0,0 +1,7 @@
# Convert documents
```bash
find . -name "*.odt" | xargs -I % pandoc % -o %.md
```
You can add the `-p` flag to `xargs` to confirm each command before it runs, which is useful for checking that what you are doing is actually correct.
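To preview what `xargs` will actually run, you can also swap `pandoc` for `echo`; the `-I %` substitution behaves the same. A throwaway sketch (the filenames are made up):

```shell
# Dry run: print the commands instead of executing pandoc
printf '%s\n' a.odt b.odt | xargs -I % echo pandoc % -o %.md
# → pandoc a.odt -o a.odt.md
#   pandoc b.odt -o b.odt.md
```

Note that `%.md` expands to `<name>.odt.md`, which is why a rename step afterwards comes in handy.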


@@ -0,0 +1,7 @@
# Delete empty directories
```bash
find . -type d -empty -delete
```
This recursively finds and deletes empty directories.
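A quick sandbox check in a throwaway directory (the paths are made up):

```shell
# Build a small tree, then delete only the empty directories
tmp=$(mktemp -d)
mkdir -p "$tmp/keep" "$tmp/empty/nested"
touch "$tmp/keep/file.txt"
find "$tmp" -type d -empty -delete
# "empty/nested" goes first, then "empty" (now empty itself); "keep" survives
```

`-delete` implies depth-first traversal, which is why nested empty directories are removed bottom-up.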

content/cli/diff.md Executable file

@@ -0,0 +1,7 @@
# See difference between files / directories
```bash
diff -ry dir1 dir2
```
The `-r` flag recurses into directories. The `-y` flag creates a side-by-side view, which is generally easier to read. `-q` only reports which files differ.
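A minimal sandbox showing the `-q` output (directory and file names are made up):

```shell
# Two throwaway directories, one file identical, one differing
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
echo same > "$tmp/a/same.txt"; echo same > "$tmp/b/same.txt"
echo foo > "$tmp/a/changed.txt"; echo bar > "$tmp/b/changed.txt"
# Only reports which files differ; exits non-zero when there are differences
diff -rq "$tmp/a" "$tmp/b" || true
```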


@@ -0,0 +1,23 @@
---
tags:
- cli
- macos
- format
---
# Format a Drive
## macOS
Sometimes Disk Utility cannot format a whole drive for some reason.
```bash
# Check devices
diskutil list
# To exfat
diskutil eraseDisk EXFAT "NAME" GPT /dev/diskN
# To fat32, with mbr
diskutil eraseDisk FAT32 "NAME" MBR /dev/diskN
```

content/cli/rename.md Executable file

@@ -0,0 +1,15 @@
---
tags:
- rename
- regex
---
# `rename`
`rename` is a command to rename files based on a regex.
```bash
rename 's/\.odt\.md$/.md/' *.odt.md
```
> Use the `-n` flag to dry run
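If the perl `rename` is not installed, the same renaming can be sketched with plain shell parameter expansion (throwaway files for illustration):

```shell
tmp=$(mktemp -d); cd "$tmp"
touch a.odt.md b.odt.md
# ${f%.odt.md} strips the suffix, mirroring the regex above
for f in *.odt.md; do mv "$f" "${f%.odt.md}.md"; done
# Files are now a.md and b.md
```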

content/dev_ops/_meta.ts Normal file

@@ -0,0 +1,4 @@
export default {
  'github-actions': 'GitHub Actions',
  hosting: 'Hosting',
}


@@ -0,0 +1,116 @@
---
tags:
- Github Actions
- DRY
---
# Composite Actions
We often reuse `steps` across different GitHub Actions workflows. Since we generally want to follow [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) principles (and are lazy), every duplicated step has potential for improvement.
> There is also a [good guide/tutorial by James Wallis](https://wallis.dev/blog/composite-github-actions), which this is mainly inspired by.
## Composite Actions vs Reusable Workflows
Within GitHub Actions there are two ways to achieve this: **Composite Actions** and **Reusable Workflows**. Here is a [good comparison by cardinalby](https://cardinalby.github.io/blog/post/github-actions/dry-reusing-code-in-github-actions/).
## Key Points of Composite Actions
- Can live in the same repository, but can also be outsourced into a repository of their own.
- Share the same filesystem, so no build artifacts need to be passed around.
- Secrets cannot be accessed directly; they need to be passed in as inputs.
- Each action has to have its own directory with an `action.yaml` file inside it.
- When executing raw commands we need to specify the `shell` we are running in.
## Example
The example shows how to extract part of a GitHub Actions workflow into a composite action. In this case: building some LaTeX files.
```
.github/
├── actions
│ └── build
│ └── action.yaml
└── workflows
├── preview.yml
└── release.yml
```
```yaml
# .github/actions/build/action.yaml
name: 'Latex Builder'
description: 'Checkout and build LaTeX files.'
inputs:
  # As we cannot access secrets directly, they must be passed
  github-token:
    description: 'GitHub token for authentication.'
    required: true
runs:
  using: 'composite' # This is the magic
  steps:
    - uses: actions/cache@v3
      name: Tectonic Cache
      with:
        path: ~/.cache/Tectonic
        key: ${{ runner.os }}-tectonic-${{ hashFiles('**/*.tex') }}
        restore-keys: |
          ${{ runner.os }}-tectonic-
    - uses: wtfjoke/setup-tectonic@v2
      with:
        github-token: ${{ inputs.github-token }}
    - name: Run Tectonic
      run: make tectonic
      shell: bash # This would not be required in a normal workflow file
```
```yaml
# .github/workflows/preview.yml
name: 'Preview'
on:
  # ...
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/build
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
      - name: Upload PDFs
        uses: actions/upload-artifact@v2
        with:
          name: PDFs
          path: '*.pdf'
```
```yaml
# .github/workflows/release.yml
name: 'Release'
on:
  # ...
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/actions/build
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
      - name: Release
        uses: ncipollo/release-action@v1
        with:
          allowUpdates: true
          artifacts: '*.pdf'
          token: ${{ secrets.GITHUB_TOKEN }}
```
## Gotchas
- If we use a local composite action, the `actions/checkout@v3` step cannot be inside the composite action itself: the action lives in the repository, so it does not exist in the runner's workspace until checkout has run.


@@ -0,0 +1,63 @@
---
tags:
- Github Actions
- Pages
- Static Site
---
# GitHub Pages with Actions
Publish static sites to GitHub Pages using Actions.
## Example
The example uses `docs` as the built folder containing the static site.
```yaml
name: Docs
on:
  push:
    branches:
      - main
  workflow_dispatch:
permissions:
  contents: read
  pages: write
  id-token: write
concurrency:
  group: 'pages'
  cancel-in-progress: true
jobs:
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Build some static assets
      - uses: actions/configure-pages@v3
      - uses: actions/upload-pages-artifact@v1
        with:
          path: './docs'
      - id: deployment
        uses: actions/deploy-pages@v1
```
## Path prefix
Note that we require a path prefix to be set, as GitHub Pages sites are published under `https://<username>.github.io/<repo>/`.
### Vite
For Vite you can set it with the [base option](https://vitejs.dev/config/shared-options.html#base).
```bash
vite build --emptyOutDir --base=./
```


@@ -0,0 +1,83 @@
---
tags:
- LaTeX
- Github Actions
- CD
- Pipeline
- Tectonic
---
# Building LaTeX in GitHub Actions
This pipeline uses [tectonic](https://tectonic-typesetting.github.io) as the build system for LaTeX. Covered here are:
- Custom fonts
- Pipeline
- Upload generated files as artifacts
## Fonts
If we are using custom fonts, we need to make them available first. This means checking them into the repo (or downloading them at build time). In this case I chose to store them as LFS files.
On most Linux systems you can install custom fonts under `~/.fonts`.
```
./fonts/
├── Open_Sans.zip
├── Roboto_Mono.zip
└── install.sh
```
```sh
#!/bin/sh
TARGET=~/.fonts
mkdir -p $TARGET
unzip -o -d "$TARGET/roboto_mono" "./fonts/Roboto_Mono.zip"
unzip -o -d "$TARGET/open_sans" "./fonts/Open_Sans.zip"
```
## Pipeline
```yaml
name: 'Build LaTeX'
on:
  pull_request:
  push:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Optional cache of downloaded TeX packages
      - uses: actions/cache@v3
        name: Tectonic Cache
        with:
          path: ~/.cache/Tectonic
          key: ${{ runner.os }}-tectonic-${{ hashFiles('**/*.tex') }}
          restore-keys: |
            ${{ runner.os }}-tectonic-
      # Install tectonic
      - uses: wtfjoke/setup-tectonic@v2
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
      - name: Install fonts
        run: ./fonts/install.sh
      - name: Build
        run: tectonic src/main.tex
      - name: Upload PDFs
        uses: actions/upload-artifact@v2
        with:
          name: PDFs
          path: '*.pdf'
```


@@ -0,0 +1,63 @@
# Publish Docker images
This is how to publish a Docker image simultaneously to the official Docker Hub and GitHub registries.
**Supported features**
- **x86** and **arm** images
- Push to **both** registries
- Semver tag labeling
We will assume that our image is called `foo/bar`, so our username is `foo` and the actual package is `bar`.
```yaml
name: Publish Docker image
on:
  release:
    types: [published]
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          install: true
      - name: Docker Labels
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: |
            foo/bar
            ghcr.io/${{ github.repository }}
          # This assumes your repository is also github.com/foo/bar
          # You could also use ghcr.io/foo/some-package as long as you are the user/org "foo"
          tags: |
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
      - name: Log in to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_TOKEN }}
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
```


@@ -0,0 +1,14 @@
---
tags:
- docker registry
- hosting
- authentication
---
# Set up your own authenticated Docker Registry
## Resources
- https://earthly.dev/blog/private-docker-registry/
- https://www.digitalocean.com/community/tutorials/how-to-set-up-a-private-docker-registry-on-ubuntu-20-04
- https://github.com/docker/get-involved/blob/90c9470fd66c9318fec9c6f0914cb70fa87b9bf9/content/en/docs/CommunityLeaders/EventHandbooks/Docker101/registry/_index.md?plain=1#L203


@@ -0,0 +1,61 @@
# Imgproxy with caching
A simple Docker Compose file that enables caching of the transformed [imgproxy](https://github.com/imgproxy/imgproxy) responses, powered by nginx.
```yaml
version: '3.8'
volumes:
  cache:
services:
  img:
    image: darthsim/imgproxy
    environment:
      # Required for nginx
      IMGPROXY_BIND: 0.0.0.0:80
      # Security
      IMGPROXY_MAX_SRC_RESOLUTION: 100
      IMGPROXY_ALLOWED_SOURCES: https://images.unsplash.com/,https://images.pexels.com/
      # Transforms (quoted, as compose environment values must be strings)
      IMGPROXY_ENFORCE_WEBP: 'true'
      IMGPROXY_ENFORCE_AVIF: 'true'
      IMGPROXY_ONLY_PRESETS: 'true'
      IMGPROXY_PRESETS: default=resizing_type:fit,250=size:250:250,500=size:500:500,1000=size:1000:1000,1500=size:1500:1500,2000=size:2000:2000
  proxy:
    image: nginx
    ports:
      - 80:80
    volumes:
      - ./proxy.conf:/etc/nginx/conf.d/default.conf:ro
      - cache:/tmp
```
```nginx
# proxy.conf
# Set cache to 30 days, 1GB.
# Only use the uri as the cache key, as it's the only input for imgproxy.
proxy_cache_path /tmp levels=1:2 keys_zone=backcache:8m max_size=1g inactive=30d;
proxy_cache_key "$uri";
proxy_cache_valid 200 302 30d;

server {
    listen 80;
    server_name _;

    location / {
        proxy_pass_request_headers off;
        proxy_set_header HOST $host;
        proxy_set_header Accept $http_accept;
        proxy_pass http://img;
        proxy_cache backcache;
    }
}
```


@@ -0,0 +1,227 @@
# Outline
[Outline](https://www.getoutline.com/) does not make it super easy to avoid paying for their hosted version, so a few things are a bit rough. Here are the [official docs](https://wiki.generaloutline.com/s/hosting/doc/hosting-outline-nipGaCRBDu).
1. Copy `docker-compose.yaml` and `.env`
2. Fill in the missing values
3. Manually create a bucket called `wiki` in the MinIO dashboard
```yaml
version: '3.8'
networks:
  proxy:
    external: true
services:
  outline:
    image: outlinewiki/outline
    restart: unless-stopped
    env_file: .env
    command: sh -c "yarn db:migrate --env production-ssl-disabled && yarn start"
    depends_on:
      - db
      - redis
      - storage
    networks:
      - default
      - proxy
    labels:
      - traefik.enable=true
      - traefik.http.routers.outline.rule=Host(`example.org`)
      - traefik.http.routers.outline.entrypoints=secure
      - traefik.http.routers.outline.tls.certresolver=cf
  redis:
    restart: unless-stopped
    image: redis
  db:
    image: postgres:15-alpine
    restart: unless-stopped
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      # PGSSLMODE: disable
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: outline
  storage:
    image: minio/minio
    restart: unless-stopped
    command: server /data --console-address ":80"
    volumes:
      - ./data/s3:/data
    environment:
      - MINIO_ROOT_USER=user
      - MINIO_ROOT_PASSWORD=pass
      - MINIO_DOMAIN=s3.example.org
    networks:
      - proxy
    labels:
      - traefik.enable=true
      - traefik.http.routers.s3.rule=Host(`s3.example.org`)
      - traefik.http.routers.s3.entrypoints=secure
      - traefik.http.routers.s3.tls.certresolver=cf
      - traefik.http.routers.s3.service=s3-service
      - traefik.http.services.s3-service.loadbalancer.server.port=9000
      - traefik.http.routers.s3-dash.rule=Host(`s3-dash.example.org`)
      - traefik.http.routers.s3-dash.entrypoints=secure
      - traefik.http.routers.s3-dash.tls.certresolver=cf
      - traefik.http.routers.s3-dash.service=s3-dash-service
      - traefik.http.services.s3-dash-service.loadbalancer.server.port=80
```
```env
# https://github.com/outline/outline/blob/main/.env.sample
# REQUIRED
NODE_ENV=production
SECRET_KEY=
UTILS_SECRET=
DATABASE_URL=postgres://user:pass@db:5432/outline
PGSSLMODE=disable
REDIS_URL=redis://redis:6379
URL=https://example.org
PORT=3000
COLLABORATION_URL=
AWS_ACCESS_KEY_ID=user
AWS_SECRET_ACCESS_KEY=pass
AWS_S3_ACCELERATE_URL=https://s3.example.org/wiki
AWS_S3_UPLOAD_BUCKET_URL=https://s3.example.org/wiki
AWS_S3_UPLOAD_BUCKET_NAME=wiki
AWS_S3_FORCE_PATH_STYLE=false
# AUTHENTICATION
# Third party signin credentials, at least ONE OF EITHER Google, Slack,
# or Microsoft is required for a working installation or you'll have no sign-in
# options.
# To configure Slack auth, you'll need to create an Application at
# => https://api.slack.com/apps
#
# When configuring the Client ID, add a redirect URL under "OAuth & Permissions":
# https://<URL>/auth/slack.callback
SLACK_CLIENT_ID=
SLACK_CLIENT_SECRET=
# To configure Google auth, you'll need to create an OAuth Client ID at
# => https://console.cloud.google.com/apis/credentials
#
# When configuring the Client ID, add an Authorized redirect URI:
# https://<URL>/auth/google.callback
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# To configure Microsoft/Azure auth, you'll need to create an OAuth Client. See
# the guide for details on setting up your Azure App:
# => https://wiki.generaloutline.com/share/dfa77e56-d4d2-4b51-8ff8-84ea6608faa4
AZURE_CLIENT_ID=
AZURE_CLIENT_SECRET=
AZURE_RESOURCE_APP_ID=
# To configure generic OIDC auth, you'll need some kind of identity provider.
# See documentation for whichever IdP you use to acquire the following info:
# Redirect URI is https://<URL>/auth/oidc.callback
OIDC_CLIENT_ID=
OIDC_CLIENT_SECRET=
OIDC_AUTH_URI=
OIDC_TOKEN_URI=
OIDC_USERINFO_URI=
# Specify which claims to derive user information from
# Supports any valid JSON path with the JWT payload
OIDC_USERNAME_CLAIM=preferred_username
# Display name for OIDC authentication
OIDC_DISPLAY_NAME=OpenID
# Space separated auth scopes.
OIDC_SCOPES=openid profile email
# OPTIONAL
# Base64 encoded private key and certificate for HTTPS termination. This is only
# required if you do not use an external reverse proxy. See documentation:
# https://wiki.generaloutline.com/share/1c922644-40d8-41fe-98f9-df2b67239d45
SSL_KEY=
SSL_CERT=
# If using a Cloudfront/Cloudflare distribution or similar it can be set below.
# This will cause paths to javascript, stylesheets, and images to be updated to
# the hostname defined in CDN_URL. In your CDN configuration the origin server
# should be set to the same as URL.
CDN_URL=
# Auto-redirect to https in production. The default is true but you may set to
# false if you can be sure that SSL is terminated at an external loadbalancer.
FORCE_HTTPS=false
# Have the installation check for updates by sending anonymized statistics to
# the maintainers
ENABLE_UPDATES=true
# How many processes should be spawned. As a reasonable rule divide your servers
# available memory by 512 for a rough estimate
WEB_CONCURRENCY=1
# Override the maximum size of document imports, could be required if you have
# especially large Word documents with embedded imagery
MAXIMUM_IMPORT_SIZE=5120000
# You can remove this line if your reverse proxy already logs incoming http
# requests and this ends up being duplicative
#DEBUG=http
# For a complete Slack integration with search and posting to channels the
# following configs are also needed, some more details
# => https://wiki.generaloutline.com/share/be25efd1-b3ef-4450-b8e5-c4a4fc11e02a
#
SLACK_VERIFICATION_TOKEN=your_token
SLACK_APP_ID=A0XXXXXXX
SLACK_MESSAGE_ACTIONS=true
# Optionally enable google analytics to track pageviews in the knowledge base
GOOGLE_ANALYTICS_ID=
# Optionally enable Sentry (sentry.io) to track errors and performance,
# and optionally add a Sentry proxy tunnel for bypassing ad blockers in the UI:
# https://docs.sentry.io/platforms/javascript/troubleshooting/#using-the-tunnel-option)
SENTRY_DSN=
SENTRY_TUNNEL=
# To support sending outgoing transactional emails such as "document updated" or
# "you've been invited" you'll need to provide authentication for an SMTP server
SMTP_HOST=
SMTP_PORT=
SMTP_USERNAME=
SMTP_PASSWORD=
SMTP_FROM_EMAIL=
SMTP_REPLY_EMAIL=
SMTP_TLS_CIPHERS=
SMTP_SECURE=true
# The default interface language. See translate.getoutline.com for a list of
# available language codes and their rough percentage translated.
DEFAULT_LANGUAGE=en_US
# Optionally enable rate limiter at application web server
RATE_LIMITER_ENABLED=true
# Configure default throttling parameters for rate limiter
RATE_LIMITER_REQUESTS=1000
RATE_LIMITER_DURATION_WINDOW=60
```


@@ -0,0 +1,50 @@
---
tags:
- docker
- vpn
- transmission
- torrent
---
# Dockerised Transmission over VPN
This setup runs a VPN client in Docker, for downloading all your Linux ISOs over a VPN.
It works by running the amazing [gluetun](https://github.com/qdm12/gluetun) container with `container_name: vpn` and referencing that name in every container whose traffic should go through the VPN via `network_mode: 'container:vpn'`.
The two containers don't have to be in the same docker-compose file.
All traffic is then routed through the VPN container, which is also where the ports are published.
Many VPN providers are supported; just look at the gluetun docs.
```yaml
version: '3.8'
services:
  vpn:
    image: qmcgaw/gluetun
    container_name: vpn
    restart: unless-stopped
    cap_add:
      - NET_ADMIN
    ports:
      - 9091:9091
    environment:
      - VPN_SERVICE_PROVIDER=nordvpn
      - SERVER_REGIONS=Switzerland
      - OPENVPN_USER=
      - OPENVPN_PASSWORD=
  transmission:
    image: lscr.io/linuxserver/transmission:latest
    restart: unless-stopped
    network_mode: 'container:vpn'
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/London
    volumes:
      - ./data/config:/config
      - ./data/source:/watch
      - /media/storage/dl:/downloads
```

content/git/clean.md Normal file

@@ -0,0 +1,17 @@
---
tags:
- git
- clean
- gitignore
- delete
---
# Delete all files mentioned by `.gitignore`
This command is useful if you want to reset a repository to its checked-out state.
```bash
git clean -Xdf
```
[Original SO Link](https://unix.stackexchange.com/a/542735)
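A sandbox run to show that `-X` only touches ignored files (the repo and identity flags are throwaway):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q
echo 'build/' > .gitignore
git add .gitignore
git -c user.email=me@example.com -c user.name=me commit -qm init
mkdir build; touch build/out.bin untracked.txt
git clean -Xdf
# build/ (ignored) is removed, untracked.txt (untracked but not ignored) survives
```

Use lowercase `-x` instead if you also want untracked files gone.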


@@ -0,0 +1,28 @@
# Clear history
This removes all past commits, leaving a single commit that contains the current state of the repository.
```bash
# Create a new orphan branch (no history)
git checkout --orphan latest_branch
# Add all the files
git add -A
# Commit the changes
git commit -am "commit message"
# Delete the old main branch
git branch -D main
# Rename the current branch to main
git branch -m main
# Finally, force update your repository
git push -f origin main
# Optionally clear local caches
git gc --aggressive --prune=all
```
https://stackoverflow.com/a/26000395
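The flow can be verified in a throwaway repo (needs git >= 2.28 for `init -b`; the identity flags are just for the sandbox):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git -c user.email=me@example.com -c user.name=me commit -qm one --allow-empty
git -c user.email=me@example.com -c user.name=me commit -qm two --allow-empty
git checkout -q --orphan latest_branch
git -c user.email=me@example.com -c user.name=me commit -qm squashed --allow-empty
git branch -D main
git branch -m main
# History is now a single commit
git rev-list --count main
```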


@@ -0,0 +1,11 @@
# Remove files from repository
How to remove files from the repository without deleting them locally. [Original SO](https://stackoverflow.com/a/1143800)
```bash
# File
git rm --cached file_to_remove.txt
# Dir
git rm --cached -r directory_to_remove
```
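A sandbox run showing that the file stays on disk but leaves the index (throwaway repo):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q
echo secret > config.txt
git add config.txt
git -c user.email=me@example.com -c user.name=me commit -qm add
git rm --cached -q config.txt
# Still on disk, but no longer tracked
test -f config.txt
git ls-files   # prints nothing
```

Remember to add the file to `.gitignore` afterwards, or the next `git add -A` will pick it up again.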


@@ -0,0 +1,20 @@
---
tags:
- git
- branch
- clean
- delete
---
# Delete all local branches that are already merged
This command is useful if you have a bunch of local branches that you don't need anymore.
```bash
git branch --merged | grep -v \* | xargs git branch -D
```
[Original SO Link](https://stackoverflow.com/a/10610669)
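A sandbox run (throwaway repo; a branch that points at `HEAD` counts as merged):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q
git -c user.email=me@example.com -c user.name=me commit -qm init --allow-empty
git branch merged-feature
# The grep filters out the current branch (marked with *)
git branch --merged | grep -v '\*' | xargs git branch -D
git branch --list merged-feature   # prints nothing
```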


@@ -0,0 +1,9 @@
# Remove secrets after being pushed
If you accidentally pushed a secret or some other file that should not be publicly available in your git repo, there are a few ways to remove it. My personal favourite is [BFG](https://rtyley.github.io/bfg-repo-cleaner/).
> By default the latest commit is protected; `--no-blob-protection` modifies it as well.
```bash
bfg -D "*.txt" --no-blob-protection
```


@@ -0,0 +1,11 @@
# Reset Files
How to reset files, optionally from another branch or commit.
```sh
# New way
git restore my/file.md
# From a specific branch / commit
git restore --source origin/HEAD my/file.md
# Old way
git checkout origin/HEAD -- my/file.md
```
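A sandbox run discarding a local edit (throwaway repo and filenames):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q
echo original > note.md
git add note.md
git -c user.email=me@example.com -c user.name=me commit -qm init
echo scribble > note.md
# Restore the committed version of the file
git restore note.md
cat note.md   # → original
```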


@@ -0,0 +1,16 @@
# Revert branch to commit
Revert a branch to a certain commit, discarding newer ones.
```bash
# Specific commit
git reset --hard a1d6424
# A number of reflog entries back
git reset --hard HEAD@{3}
```
```bash
# Push
git push -f
```
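A sandbox run of the counting form, using `HEAD~N` (N commits back along the first parent) rather than the reflog-based `HEAD@{N}` (throwaway repo):

```shell
tmp=$(mktemp -d); cd "$tmp"
git init -q
for m in one two three; do
  git -c user.email=me@example.com -c user.name=me commit -qm "$m" --allow-empty
done
# Drop the two newest commits
git reset -q --hard HEAD~2
git log --format=%s   # → one
```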

content/index.md Normal file

@@ -0,0 +1,5 @@
# Memoir
This is supposed to be a collection of tips, templates, notes, and whatever learnings I think might be useful again in the future. The aim is to provide a place where I can look up stuff I've done before.
The aim is not to provide detailed explanations of everything, but rather to have a quick way to remind myself how something works / is set up.

content/latex.md Executable file

@@ -0,0 +1,12 @@
# LaTeX
This is a collection of different LaTeX snippets for different use-cases.
## Building
I use [Tectonic](https://tectonic-typesetting.github.io/en-US/) exclusively as the build engine, as it is a modern alternative to the "OGs".
Most importantly, it supports lazy loading of dependencies, so the install size is tiny (in comparison).
## Pipeline
To build LaTeX in a GitHub Action, see [here](dev_ops/github-actions/latex)

content/latex/acronyms.md Executable file

@@ -0,0 +1,14 @@
# Acronyms
```latex
\usepackage{acro}

\DeclareAcronym{dtn}{
  short = DTN ,
  long  = Delay / Disruption Tolerant Networking ,
}

\ac{dtn}
\printacronyms
```

content/latex/bibliography.md Executable file

@@ -0,0 +1,34 @@
# Bibliography
```latex
\usepackage[backend=bibtex8, style=ieee]{biblatex}
\addbibresource{db.bib}
\cite{jedari2018survey}
\medskip
\printbibliography
```
```latex
@article{jedari2018survey,
  title     = {A survey on human-centric communications in non-cooperative wireless relay networks},
  author    = {Jedari, Behrouz and Xia, Feng and Ning, Zhaolong},
  journal   = {IEEE Communications Surveys \& Tutorials},
  volume    = {20},
  number    = {2},
  pages     = {914--944},
  year      = {2018},
  publisher = {IEEE}
}
```
For the backend, `biber` should be used if version compatibility is not an issue.
- [Styles](https://www.overleaf.com/learn/latex/Biblatex_citation_styles#Citation_styles)
- [Intro](https://www.overleaf.com/learn/latex/Bibliography_management_with_biblatex#Introduction)
### RFC
- [https://datatracker.ietf.org/doc/html/draft-carpenter-rfc-citation-recs-01#section-5.2](https://datatracker.ietf.org/doc/html/draft-carpenter-rfc-citation-recs-01#section-5.2)
- [https://notesofaprogrammer.blogspot.com/2014/11/bibtex-entries-for-ietf-rfcs-and.html](https://notesofaprogrammer.blogspot.com/2014/11/bibtex-entries-for-ietf-rfcs-and.html)

content/latex/code.md Executable file

@@ -0,0 +1,13 @@
# Code
```latex
% Inline
Version: \verb|1.2.3|
% Block
\begin{verbatim*}
Text enclosed inside \texttt{verbatim} environment
is printed directly
and all \LaTeX{} commands are ignored.
\end{verbatim*}
```

content/latex/footnotes.md Executable file

@@ -0,0 +1,11 @@
# Footnotes
```latex
% Simple
Something\footnote{This is a footnote}
% With link
Kubernetes\footnote{\url{https://kubernetes.io/}}
```
[https://www.overleaf.com/learn/latex/Footnotes](https://www.overleaf.com/learn/latex/Footnotes#Introduction_to_LaTeX.27s_main_footnote_commands)

content/latex/images.md Executable file

@@ -0,0 +1,19 @@
# Images
```latex
\usepackage{graphicx}
% Relative to .tex file
\graphicspath{ {../images/} }
\begin{figure}[h]
\label{fig:cat}
\caption{Miaaauu}
\centering
\includegraphics[width=0.6\textwidth]{cat.png}
\end{figure}
Some cat here! \ref{fig:cat}
```
[https://www.overleaf.com/learn/latex/Inserting_Images](https://www.overleaf.com/learn/latex/Inserting_Images)

content/latex/links.md Executable file

@@ -0,0 +1,13 @@
# Links
```latex
\usepackage{hyperref}
\href{http://www.overleaf.com}{Something Linky}
\url{http://www.overleaf.com}
```
Also required for linked TOC.
[https://www.overleaf.com/learn/latex/Hyperlinks#Linking_web_addresses](https://www.overleaf.com/learn/latex/Hyperlinks#Linking_web_addresses)

content/latex/lists.md Executable file

@@ -0,0 +1,17 @@
# Lists
```latex
% Items
\begin{itemize}
\item a
\item b
\end{itemize}
% Numbered
\begin{enumerate}
\item One
\item Two
\end{enumerate}
```
[https://www.overleaf.com/learn/latex/Lists](https://www.overleaf.com/learn/latex/Lists)

content/latex/tables.md Executable file

@@ -0,0 +1,35 @@
# Tables
```latex
\begin{tabular}{ l|r }
  \label{item:dtn-simulators-chosen}
  Name & Language \\
  \hline
  The One \cite{sim-theone} & \verb|Java| \\
  OPS \cite{sim-ops} & \verb|C++| \\
  ns3-dtn-bit \cite{sim-ns3} & \verb|C++| \\
  dtnsim \cite{sim-dtnsim} & \verb|Python| \\
  DTN \cite{sim-dtn} & \verb|C#| \\
\end{tabular}
```
```latex
\begin{table}[h!]
  \centering
  \begin{tabular}{|c c c c|}
    \hline
    Col1 & Col2 & Col3 & Col4 \\
    \hline\hline
    1 & 6 & 87837 & 787 \\
    2 & 7 & 78 & 5415 \\
    3 & 545 & 778 & 7507 \\
    4 & 545 & 18744 & 7560 \\
    5 & 88 & 788 & 6344 \\
    \hline
  \end{tabular}
  \caption{Table to test captions and labels.}
  \label{table:1}
\end{table}
```
[https://www.overleaf.com/learn/latex/Tables#Tables_with_fixed_length](https://www.overleaf.com/learn/latex/Tables#Tables_with_fixed_length)

content/latex/tu-dresden.md Executable file

@@ -0,0 +1,33 @@
# TU Dresden
```latex
\documentclass{tudscrartcl}
\iftutex
\usepackage{fontspec}
\else
\usepackage[T1]{fontenc}
\fi
\begin{document}
% Title
\faculty{Faculty of Computer Science}
\institute{Institute of Systems Architecture}
\chair{Chair of Computer Networks}
\extraheadline{Source: \href{https://github.com/cupcakearmy/master-thesis/}{github.com/cupcakearmy/master-thesis/}}
\author{Niccolo Borgioli}
\date{\today}
\title{Comparison of DTN simulators}
\maketitle
\tableofcontents
\section{Introduction}
Foo bar
\end{document}
```
[Docs](https://mirror.foobar.to/CTAN/macros/latex/contrib/tudscr/doc/tudscr.pdf)


@@ -0,0 +1,82 @@
---
tags:
- tailwind
- theme
- css variables
- dark mode
---
# Tailwind themes with CSS Variables
There are many Tailwind theme plugins; I tried a few and landed on `tailwindcss-themer`.
## Setup
Install the plugin, then configure two files:
1. `pnpm i -D tailwindcss-themer`
2. `tailwind.config.js`
3. `app.postcss`
> Technically, we don't need to use CSS variables, but I like to, as I can export them automatically from Figma or whatever design tool. Without CSS variables, you could just define the colors in `tailwind.config.js` without touching CSS files.
```js
// tailwind.config.js
const themer = require('tailwindcss-themer')

/** @type {import('tailwindcss').Config}*/
const config = {
  theme: {
    // ...
  },
  plugins: [
    themer({
      defaultTheme: {
        extend: {
          colors: {
            primary: 'var(--colors-cyan-500)',
            secondary: 'var(--colors-yellow-500)',
            surface: 'var(--colors-gray-100)',
            text: 'var(--colors-gray-900)',
          },
        },
      },
      themes: [
        {
          name: 'darker',
          mediaQuery: '@media (prefers-color-scheme: dark)',
          extend: {
            colors: {
              primary: 'var(--colors-cyan-700)',
              secondary: 'var(--colors-yellow-500)',
              surface: 'var(--colors-gray-900)',
              text: 'var(--colors-gray-100)',
            },
          },
        },
      ],
    }),
  ],
}

module.exports = config
```
```css
/* app.postcss */
@tailwind base;
@tailwind components;
@tailwind utilities;

:root {
  /* colors */
  --colors-cyan-500: rgb(164, 189, 245);
  --colors-cyan-700: #0d398c;
  --colors-gray-100: rgb(255, 255, 255);
  --colors-gray-900: rgb(1, 1, 1);
  --colors-yellow-500: rgb(233, 246, 82);
}
```