From 75d19fa5d9fe0ecd9330b124aebd31916b327b0e Mon Sep 17 00:00:00 2001 From: Niccolo Borgioli Date: Sun, 1 Dec 2024 17:36:36 +0100 Subject: [PATCH] cleanup --- ...to-directus-for-gatsby-or-sapper-as-cms.md | 53 ++-- .../automate-github-releases-with-drone.md | 60 ++-- .../blog/be-your-own-tiny-image-cdn.md | 16 +- .../blog/create-a-qr-code-for-google-drive.md | 82 ----- ...to-bring-your-neural-network-to-the-web.md | 283 +++++++++--------- src/content/blog/how-to-search-in-the-jam.md | 27 +- ...ud-from-heaven-to-the-depths-of-seafile.md | 22 +- src/content/blog/matomo-vs-ublock-origin.md | 45 +-- ...itor-your-self-hosted-services-for-free.md | 46 +-- src/content/blog/rust-in-python-made-easy.md | 40 +-- ...up-your-docker-builds-with-dockerignore.md | 12 - ...ting-detecting-dark-mode-in-the-browser.md | 12 - ...mes-i-feel-we-shoot-ourself-in-the-foot.md | 2 + ...-the-next-big-thing-a-reacts-lover-view.md | 12 - 14 files changed, 266 insertions(+), 446 deletions(-) delete mode 100644 src/content/blog/create-a-qr-code-for-google-drive.md diff --git a/src/content/blog/a-guide-to-directus-for-gatsby-or-sapper-as-cms.md b/src/content/blog/a-guide-to-directus-for-gatsby-or-sapper-as-cms.md index 6853dca..d94fe9f 100644 --- a/src/content/blog/a-guide-to-directus-for-gatsby-or-sapper-as-cms.md +++ b/src/content/blog/a-guide-to-directus-for-gatsby-or-sapper-as-cms.md @@ -25,29 +25,16 @@ The article will focus on Sapper, but the parts related to Directus are identica 3. [Create a super small frontend](#3) 4. [Write a custom hook for Directus that automatically triggers the build whenever content changes in the DB.](#4) -
- -![](images/noah-silliman-doBrZnp_wqA-unsplash.jpg) - -
- -Photo by [Noah Silliman](https://unsplash.com/@noahsilliman?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/rabbit?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- ## Installing Directus This should be straight forward. These instructions are adopted from the [official docker guide](https://docs.directus.io/installation/docker.html). I will use Docker for this. -``` +```yaml # docker-compose.yml -version: "3.7" +version: '3.7' services: - mysql: image: mysql:5.7 volumes: @@ -57,7 +44,7 @@ services: directus: image: directus/directus:v8-apache ports: - - "8000:80" + - '8000:80' env_file: .env volumes: - ./data/config:/var/directus/config @@ -66,7 +53,7 @@ services: The we run `docker-compose up -d`. After a few seconds we need to initialise Directus. -``` +```bash docker-compose run directus install --email some@email.com --password 1337 ``` @@ -114,7 +101,7 @@ Give permissions to the public user I will not explain how [Sapper](https://sapper.svelte.dev/) works as this is not the focus today. If you don't know Sapper: It's very similar to Nuxt or Next.js with the additional option to even export as static html, so the end result is similar to a Gatsby website. Very powerful and easy to use and code. -``` +```bash # Setup npx degit "sveltejs/sapper-template#rollup" my-blog cd my-blog @@ -127,52 +114,54 @@ yarn run dev Directus has a [JS SDK](https://docs.directus.io/guides/js-sdk.html) and since we have made data public we don't even need a token or authentication. Awesome 🚀 -``` +```bash yarn add @directus/sdk-js ``` First we are going to initialise the SDK. The default project name is simply `directus` -``` +```ts // ./src/lib/api.js import DirectusSDK from '@directus/sdk-js' export const client = new DirectusSDK({ url: 'http://localhost:8000', - project: 'directus' + project: 'directus', }) ``` Then lets make a server side json loader so that the exported site will not even contact the server afterwards. Completely static html. 
-``` +```ts // ./src/routes/posts.json.js import { client } from '../lib/api' -export async function get (req, res, next) { +export async function get(req, res, next) { try { const { data } = await client.getItems('posts') res.writeHead(200, { - 'Content-Type': 'application/json' + 'Content-Type': 'application/json', }) res.end(JSON.stringify(data)) } catch (e) { res.writeHead(404, { - 'Content-Type': 'application/json' + 'Content-Type': 'application/json', }) - res.end(JSON.stringify({ - message: 'Not found' - })) + res.end( + JSON.stringify({ + message: 'Not found', + }) + ) } } ``` Finally the svelte component. -``` +```svelte // ./src/routes/index.svelte - - + ``` @@ -215,85 +201,102 @@ The code is adapted from [this stackoverflow answer](https://stackoverflow.com/a In essence it's a canvas that listens on our mouse events and fills the pixels with black. Nothing more. -``` +```ts /* jslint esversion: 6, asi: true */ -var canvas, ctx, flag = false, - prevX = 0, - currX = 0, - prevY = 0, - currY = 0, - dot_flag = false; +var canvas, + ctx, + flag = false, + prevX = 0, + currX = 0, + prevY = 0, + currY = 0, + dot_flag = false -var x = "black", - y = 2; +var x = 'black', + y = 2 function init() { - canvas = document.getElementById('can'); - ctx = canvas.getContext("2d"); - w = canvas.width; - h = canvas.height; + canvas = document.getElementById('can') + ctx = canvas.getContext('2d') + w = canvas.width + h = canvas.height - canvas.addEventListener("mousemove", function (e) { - findxy('move', e) - }, false); - canvas.addEventListener("mousedown", function (e) { - findxy('down', e) - }, false); - canvas.addEventListener("mouseup", function (e) { - findxy('up', e) - }, false); - canvas.addEventListener("mouseout", function (e) { - findxy('out', e) - }, false); + canvas.addEventListener( + 'mousemove', + function (e) { + findxy('move', e) + }, + false + ) + canvas.addEventListener( + 'mousedown', + function (e) { + findxy('down', e) + }, + false + ) + 
canvas.addEventListener( + 'mouseup', + function (e) { + findxy('up', e) + }, + false + ) + canvas.addEventListener( + 'mouseout', + function (e) { + findxy('out', e) + }, + false + ) - - window.document.getElementById('clear').addEventListener('click', erase) + window.document.getElementById('clear').addEventListener('click', erase) } function draw() { - ctx.beginPath(); - ctx.moveTo(prevX, prevY); - ctx.lineTo(currX, currY); - ctx.strokeStyle = x; - ctx.lineWidth = y; - ctx.stroke(); - ctx.closePath(); + ctx.beginPath() + ctx.moveTo(prevX, prevY) + ctx.lineTo(currX, currY) + ctx.strokeStyle = x + ctx.lineWidth = y + ctx.stroke() + ctx.closePath() } function erase() { - ctx.clearRect(0, 0, w, h); + ctx.clearRect(0, 0, w, h) } function findxy(res, e) { - if (res == 'down') { - prevX = currX; - prevY = currY; - currX = e.clientX - canvas.offsetLeft; - currY = e.clientY - canvas.offsetTop; + if (res == 'down') { + prevX = currX + prevY = currY + currX = e.clientX - canvas.offsetLeft + currY = e.clientY - canvas.offsetTop - flag = true; - dot_flag = true; - if (dot_flag) { - ctx.beginPath(); - ctx.fillStyle = x; - ctx.fillRect(currX, currY, 2, 2); - ctx.closePath(); - dot_flag = false; - } + flag = true + dot_flag = true + if (dot_flag) { + ctx.beginPath() + ctx.fillStyle = x + ctx.fillRect(currX, currY, 2, 2) + ctx.closePath() + dot_flag = false } - if (res == 'up' || res == "out") { - flag = false; - } - if (res == 'move') { - if (flag) { - prevX = currX; - prevY = currY; - currX = e.clientX - canvas.offsetLeft; - currY = e.clientY - canvas.offsetTop; - draw(); - } + } + if (res == 'up' || res == 'out') { + flag = false + } + if (res == 'move') { + if (flag) { + prevX = currX + prevY = currY + currX = e.clientX - canvas.offsetLeft + currY = e.clientY - canvas.offsetTop + draw() } + } } init() @@ -301,26 +304,26 @@ init() And not the glue to put this together is the piece of code that listens on the "test" button. 
-``` -import * as tf from '@tensorflow/tfjs'; +```ts +import * as tf from '@tensorflow/tfjs' let model -tf.loadLayersModel('/model.json').then(m => { - model = m +tf.loadLayersModel('/model.json').then((m) => { + model = m }) window.document.getElementById('test').addEventListener('click', async () => { - const canvas = window.document.querySelector('canvas') + const canvas = window.document.querySelector('canvas') - const { data, width, height } = canvas.getContext('2d').getImageData(0, 0, 28, 28) + const { data, width, height } = canvas.getContext('2d').getImageData(0, 0, 28, 28) - const tensor = tf.tensor(new Uint8Array(data.filter((_, i) => i % 4 === 3)), [1, 28, 28, 1]) - const prediction = model.predict(tensor) - const result = await prediction.data() - const guessed = result.indexOf(1) - console.log(guessed) - window.document.querySelector('#result').innerText = guessed + const tensor = tf.tensor(new Uint8Array(data.filter((_, i) => i % 4 === 3)), [1, 28, 28, 1]) + const prediction = model.predict(tensor) + const result = await prediction.data() + const guessed = result.indexOf(1) + console.log(guessed) + window.document.querySelector('#result').innerText = guessed }) ``` @@ -329,7 +332,7 @@ Here we need to explain a few things. Then, instead of simply passing the data to the tensor. we need to do some magic with `data.filter` in order to get only every 3rd pixel. This is because our canvas has 3 channels + 1 alpha, but we only need to know if the pixel is black or not. 
We do this by simply filtering for the index mod 4 -``` +```ts data.filter((_, i) => i % 4 === 3) ``` diff --git a/src/content/blog/how-to-search-in-the-jam.md b/src/content/blog/how-to-search-in-the-jam.md index 31cbe31..22e560a 100644 --- a/src/content/blog/how-to-search-in-the-jam.md +++ b/src/content/blog/how-to-search-in-the-jam.md @@ -1,6 +1,7 @@ --- title: 'How to search in the JAM' date: '2020-12-06' +coverImage: './images/uriel-soberanes-gCeH4z9m7bg-unsplash.jpg' categories: - 'coding' tags: @@ -21,18 +22,6 @@ We will look at the following: 2. Search Accuracy & Precision 3. Performance & Size -
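For intuition on the `i % 4 === 3` trick from the TensorFlow.js section above: RGBA canvas data is a flat byte array, so keeping only every fourth byte yields the alpha channel. A standalone sketch (the sample pixel data is made up for illustration):

```javascript
// RGBA pixel data is flat: [r, g, b, a, r, g, b, a, ...].
// For black strokes on a transparent canvas only the alpha byte carries information.
const rgba = [0, 0, 0, 255, 0, 0, 0, 0, 0, 0, 0, 128]

// Keep every 4th byte (index ≡ 3 mod 4) → the alpha channel.
const alpha = rgba.filter((_, i) => i % 4 === 3)

console.log(alpha) // → [255, 0, 128]
```

The same filter applied to the real `getImageData` buffer is what feeds the 28×28 tensor.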
- -![Telescope](images/uriel-soberanes-gCeH4z9m7bg-unsplash.jpg) - -
- -Photo by [Uriel Soberanes](https://unsplash.com/@soberanes?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/telescope?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- We can't rely on a backend as discussed above, so the magic will happen at build time, like everything in the JAM-verse. I've decided to go with the free and open source [lunr.js](https://lunrjs.com/) which is a simple but still quite powerful search engine that can run in the client. @@ -60,7 +49,7 @@ So I'm using [Sapper](https://sapper.svelte.dev/) for this blog so the examples First we need to aggregate all our data. In my case this means all the single pages, blog entries, projects and works. So I created a `/src/routes/search.json.js` file and got to work. -``` +```ts import lunr from 'lunr' import { getAll } from '../lib/wp' @@ -109,7 +98,7 @@ Now I have the "search model" ready. You can have a look: [nicco.io/search.json] It's time to integrate the search into the actual website 🚀 -``` +```html - + ``` The first thing we do is load our preloaded `/search.json` and loading into an instance of `lunr`. This only need to happen once, once the index is loaded we ready to go. -``` +```ts const idx = lunr.Index.load(prebuilt) ``` For the searching itself `lunr` has quite a [few options](https://lunrjs.com/guides/searching.html). The most relevant for me where the wildcard and fuzzy search. While wildcard is good for when we don't have completed a word yet, fuzzy helps us with typos. -``` +```ts const fuzzy = idx.search(needle + '~1') // foo~1 ``` While not explicitly said in the docs I'm guessing they use the [Levenshtein Distance](https://en.wikipedia.org/wiki/Levenshtein_distance), which means `~1` will replace at most 1 char. 
-``` +```ts const wildcard = idx.search(needle + '*') // fo* ``` diff --git a/src/content/blog/leaving-nextcloud-from-heaven-to-the-depths-of-seafile.md b/src/content/blog/leaving-nextcloud-from-heaven-to-the-depths-of-seafile.md index 3f00b16..2a764e3 100644 --- a/src/content/blog/leaving-nextcloud-from-heaven-to-the-depths-of-seafile.md +++ b/src/content/blog/leaving-nextcloud-from-heaven-to-the-depths-of-seafile.md @@ -20,18 +20,6 @@ There are numerous of plug-ins that can accomplish anything from contacts syncin Trying to be everything at the same time comes at a cost. And that is generally an experience that at least in my experience never feels polished or finished. While the Nextcloud Plug-Ins are incredibly versatile and powerful they also leave room for segmentation and you will notice it. -
- -![](images/pawel-nolbert-xe-ss5Tg2mo-unsplash.jpg) - -
- -Cloud and Ocean - -
- -
- ### The permanent alpha That's what using Nextcloud feels like 75% of the time. I have no insight into the company behind the project but it feels like they are chasing a release cycle for the sake of paper launching unfinished features that compromise in terms of stability and polish. The thing that bothers me the most is that they are constantly marketed as "production ready" when they clearly had not nearly enough QA. @@ -78,7 +66,7 @@ Seafile on the other hand just had the release of it's 8th version (still in bet I had to migrate 2 things: Cal/CardDav for Calendar and Contacts and the files drive itself. Spinning up a Seafile instance was a breeze as I host every single service with docker. -``` +```bash # .env MYSQL_ROOT_PASSWORD=random DB_HOST=db @@ -88,8 +76,8 @@ SEAFILE_ADMIN_EMAIL=me@example.com SEAFILE_ADMIN_PASSWORD=a_very_secret_password ``` -``` -version: "2.0" +```yaml +version: '2.0' services: db: @@ -129,13 +117,13 @@ Since Seafile focuses only on the "Drive" component I had to migrate my contacts You can find my [Radicale docker image here](https://github.com/cupcakearmy/docker-radicale), maybe you find it useful. It supports bcrypt passwords and can be deployed with just the env variables for `USER` and `PASSWORD`. It has been tested with the iOS and macOS native clients. -``` +```bash # .env USER=foo PASSWORD=secret ``` -``` +```yaml # docker-compose.yml version: '3.7' diff --git a/src/content/blog/matomo-vs-ublock-origin.md b/src/content/blog/matomo-vs-ublock-origin.md index f419463..2c7bb72 100644 --- a/src/content/blog/matomo-vs-ublock-origin.md +++ b/src/content/blog/matomo-vs-ublock-origin.md @@ -27,19 +27,24 @@ Then I quickly copied the JS tracker code in my main html template and thought t So turns out that Matomo, being widely used is of course included in many Ad-Blocker lists and therefore my stats did not work. Lets see why: Basically all ad blockers work with lists. Those lists include pattern that if matched will be filtered out. 
Let's take a look at the default Matomo tracking code: -``` +```html ``` @@ -79,7 +84,7 @@ Luckily Apache has the famous Rewrite module, which will solve all our problems. We can create a `.htaccess` file in the root of our Matomo installation folder, to cloak our requests. -``` +```apache # .htaccess RewriteEngine On RewriteRule ^unicorn matomo.js @@ -88,11 +93,11 @@ RewriteRule ^rainbow matomo.php Now if we request `https://stats.nicco.io/unicorn` we actually get the response for `https://stats.nicco.io/matomo.js` and the same for `rainbow` and `matomo.php`. -``` +```js // Replace in the client -_paq.push(['setTrackerUrl', u+'matomo.php']); // Before -_paq.push(['setTrackerUrl', u+'rainbow']); // After +_paq.push(['setTrackerUrl', u + 'matomo.php']) // Before +_paq.push(['setTrackerUrl', u + 'rainbow']) // After g.src = u + 'matomo.js' // Before g.src = u + 'unicorn' // After @@ -102,7 +107,7 @@ g.src = u + 'unicorn' // After I had to create a minuscule `Dockerfile` as the `Rewrite` module is not enabled per default in the standard Matomo docker image. -``` +```Dockerfile # Dockerfile FROM matomo RUN a2enmod rewrite @@ -118,9 +123,9 @@ Now as you can see it's incredibly easy to mask tracking stuff, and I bet there The `Dockerfile` and the `.htaccess` files are shown above. 
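To see why renaming the endpoints defeats the filters, here is a toy model of list-based blocking (the patterns are illustrative, not taken from a real blocklist):

```javascript
// Ad blockers match request URLs against lists of patterns.
const blocklist = [/matomo\.js/, /matomo\.php/, /piwik/]

const isBlocked = (url) => blocklist.some((pattern) => pattern.test(url))

console.log(isBlocked('https://stats.nicco.io/matomo.js')) // true → request filtered
console.log(isBlocked('https://stats.nicco.io/unicorn')) // false → cloaked request passes
```

Since `unicorn` and `rainbow` match no known pattern, the rewritten requests sail straight through.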
-``` +```yaml # docker-compose.yml -version: "3.7" +version: '3.7' networks: traefik: @@ -149,13 +154,13 @@ services: - traefik.docker.network=traefik - traefik.port=80 - traefik.backend=matomo - - "traefik.frontend.rule=Host:stats.nicco.io;" + - 'traefik.frontend.rule=Host:stats.nicco.io;' networks: - traefik - default ``` -``` +```bash # .env MYSQL_DATABASE=matomo MYSQL_USER=matomo diff --git a/src/content/blog/monitor-your-self-hosted-services-for-free.md b/src/content/blog/monitor-your-self-hosted-services-for-free.md index adf2ed0..6758bdd 100644 --- a/src/content/blog/monitor-your-self-hosted-services-for-free.md +++ b/src/content/blog/monitor-your-self-hosted-services-for-free.md @@ -25,7 +25,7 @@ For monitoring we will use [Uptime Kuma](https://github.com/louislam/uptime-kuma First we need to [instal docker](https://docs.docker.com/engine/install/debian/#install-using-the-repository) -``` +```bash curl -fsSL https://download.docker.com/linux/debian/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg echo \ @@ -38,7 +38,7 @@ apt install docker-ce docker-ce-cli containerd.io docker-compose-plugin Also we want some basic firewall -``` +```bash apt install ufw ufw allow 80 ufw allow 443 @@ -65,7 +65,7 @@ We only need a `docker-compose.yaml` file now and we should be up and running. I Lets start with Traefik. It will handle all our routing and TLS certificates. 
Remember to change the acme email down in the `traefik.yaml` -``` +```yaml version: '3.8' networks: @@ -78,36 +78,36 @@ services: image: traefik:2.8 restart: unless-stopped ports: - - "80:80" - - "443:443" + - '80:80' + - '443:443' volumes: - /var/run/docker.sock:/var/run/docker.sock - ./traefik.yaml:/etc/traefik/traefik.yaml:ro - ./data:/data labels: - - "traefik.enable=true" + - 'traefik.enable=true' # HTTP to HTTPS redirection - - "traefik.http.routers.http_catchall.rule=HostRegexp(`{any:.+}`)" - - "traefik.http.routers.http_catchall.entrypoints=insecure" - - "traefik.http.routers.http_catchall.middlewares=https_redirect" - - "traefik.http.middlewares.https_redirect.redirectscheme.scheme=https" - - "traefik.http.middlewares.https_redirect.redirectscheme.permanent=true" + - 'traefik.http.routers.http_catchall.rule=HostRegexp(`{any:.+}`)' + - 'traefik.http.routers.http_catchall.entrypoints=insecure' + - 'traefik.http.routers.http_catchall.middlewares=https_redirect' + - 'traefik.http.middlewares.https_redirect.redirectscheme.scheme=https' + - 'traefik.http.middlewares.https_redirect.redirectscheme.permanent=true' ``` -``` -#Define HTTP and HTTPS entrypoints +```yaml +# Define HTTP and HTTPS entrypoints entryPoints: insecure: - address: ":80" + address: ':80' secure: - address: ":443" + address: ':443' #Dynamic configuration will come from docker labels providers: docker: - endpoint: "unix:///var/run/docker.sock" - network: "proxy" + endpoint: 'unix:///var/run/docker.sock' + network: 'proxy' exposedByDefault: false #Enable acme with http file challenge @@ -123,7 +123,7 @@ certificatesResolvers: To get traefik running we just need to type the following -``` +```bash docker network create proxy docker compose up -d ``` @@ -132,7 +132,7 @@ docker compose up -d The compose file for Kuma is compact. Don't forget to change the domain to yours. 
-``` +```yaml version: '3.8' networks: @@ -147,10 +147,10 @@ services: volumes: - ./data:/app/data labels: - - traefik.enable=true - - traefik.http.routers.kuma.rule=Host(`status.example.org`) - - traefik.http.routers.kuma.entrypoints=secure - - traefik.http.routers.kuma.tls.certresolver=le + - traefik.enable=true + - traefik.http.routers.kuma.rule=Host(`status.example.org`) + - traefik.http.routers.kuma.entrypoints=secure + - traefik.http.routers.kuma.tls.certresolver=le ``` Now you can navigate to your new monitoring website and create and admin account and setup monitors, alert systems and so on. diff --git a/src/content/blog/rust-in-python-made-easy.md b/src/content/blog/rust-in-python-made-easy.md index a6d30d2..caa2911 100644 --- a/src/content/blog/rust-in-python-made-easy.md +++ b/src/content/blog/rust-in-python-made-easy.md @@ -22,18 +22,6 @@ Overview 4. [lists / arrays](#lists) 5. [complex data types handling](#complex) -
- -![](images/jonathan-chng-HgoKvtKpyHA-unsplash-scaled-1.jpg) - -
- -Photo by [Jonathan Chng](https://unsplash.com/@jon_chng?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/run?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- Lets assume we want to run the following python code in rust. ``` @@ -54,19 +42,19 @@ Lets see the steps we need to take to achieve this: First lets create a new rust project by running: -``` +```bash cargo new rust_in_python ``` Then lets rename `src/main.rs` to `src/lib.rs` as we want a library and not standalone program. -``` +```bash mv src/main.rs src/lib.rs ``` Now we simply write a hello world function in rust -``` +```rust #[no_mangle] fn hello() { println!("Hello from rust 👋"); @@ -77,20 +65,20 @@ For every function that need to be available to other languages (in our case Pyt The last step is to tell rust to compile to a dynamic library. To do so simply add the following to your `Cargo.toml` config file. -``` +```toml [lib] crate-type = ["dylib"] ``` Now we are ready to build 🚀 -``` +```bash cargo build --release ``` Now just create a `main.py` file and we can import and run our function. -``` +```py from ctypes import CDLL lib = CDLL("target/release/librust_in_python.dylib") @@ -99,7 +87,7 @@ lib.hello() And if you run it you will be greeted from rust. No need to install, the `ctypes` package is included the standard python library. -``` +```bash python main.py ``` @@ -111,7 +99,7 @@ Before we start I would like to remind you that python is untyped whereas rust o First lets write the simple add function in rust -``` +```rust #[no_mangle] fn add(a: f64, b: f64) -> f64 { return a + b; @@ -120,13 +108,13 @@ fn add(a: f64, b: f64) -> f64 { Don't forget to build again 😉 -``` +```bash cargo build --release ``` Now to the python part -``` +```py from ctypes import CDLL, c_double lib = CDLL("target/release/librust_in_python.dylib") @@ -158,7 +146,7 @@ So what about lists? 
Unfortunately I have not found a way to use Vectors for dyn ###### Rust -``` +```rust #[no_mangle] fn sum(arr: [i32; 5]) -> i32 { let mut total: i32 = 0; @@ -171,7 +159,7 @@ fn sum(arr: [i32; 5]) -> i32 { ###### Python -``` +```python from ctypes import CDLL, c_int lib = CDLL("target/release/librust_in_python.dylib") @@ -191,7 +179,7 @@ Often it can be very useful to send and/or receive data in a structured, compact ###### Rust -``` +```rust #[repr(C)] pub struct Point { pub x: f64, @@ -206,7 +194,7 @@ fn greet_point(p: Point) { ###### Python -``` +```python from ctypes import CDLL, Structure, c_double lib = CDLL("target/release/librust_in_python.dylib") diff --git a/src/content/blog/speed-up-your-docker-builds-with-dockerignore.md b/src/content/blog/speed-up-your-docker-builds-with-dockerignore.md index 656f428..9aaa9c0 100644 --- a/src/content/blog/speed-up-your-docker-builds-with-dockerignore.md +++ b/src/content/blog/speed-up-your-docker-builds-with-dockerignore.md @@ -12,18 +12,6 @@ So you ever wondered why your docker build takes so long to startup when all you Fear no more! `.dockerignore` to the rescue ⚓️. -
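For reference, a minimal `.dockerignore` for a typical Node project might look like this (the entries are illustrative — adjust them to your stack):

```
# .dockerignore — keep the build context small
node_modules
.git
dist
*.log
```

Every path listed here is skipped when the build context is sent to the daemon, which is where the startup time is won.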
- -![](images/thomas-kelley-t20pc32VbrU-unsplash-scaled-1.jpg) - -
- -Photo by [Thomas Kelley](https://unsplash.com/@thkelley?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/whale?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- Whenever you build a docker image the first thing you will always see is the following: ```bash diff --git a/src/content/blog/supporting-detecting-dark-mode-in-the-browser.md b/src/content/blog/supporting-detecting-dark-mode-in-the-browser.md index 5a3b49e..13f3def 100644 --- a/src/content/blog/supporting-detecting-dark-mode-in-the-browser.md +++ b/src/content/blog/supporting-detecting-dark-mode-in-the-browser.md @@ -18,18 +18,6 @@ We will look at a few ways how to detect and handle dark modes in 2020: 2. [JS](#js) 3. [React](#react) -
- -![](images/davisco-5E5N49RWtbA-unsplash-scaled-1.jpg) - -
- -Photo by [davisco](https://unsplash.com/@codytdavis?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/contrast?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- ## Pure CSS First lets have a look how we can do this using only CSS. There is a new css media query that is supported by [almost any browser](https://caniuse.com/#feat=prefers-color-scheme) right now. diff --git a/src/content/blog/why-i-love-js-but-sometimes-i-feel-we-shoot-ourself-in-the-foot.md b/src/content/blog/why-i-love-js-but-sometimes-i-feel-we-shoot-ourself-in-the-foot.md index a80c60a..17f6da2 100644 --- a/src/content/blog/why-i-love-js-but-sometimes-i-feel-we-shoot-ourself-in-the-foot.md +++ b/src/content/blog/why-i-love-js-but-sometimes-i-feel-we-shoot-ourself-in-the-foot.md @@ -3,6 +3,8 @@ title: 'Why I love JS but sometimes I feel we shoot ourself in the foot.' date: '2020-05-29' categories: - 'general' +tags: + - rant --- Let's start by saying this: I absolutely love JS & Typescript, they are my favourite languages and I would not want to live without them. diff --git a/src/content/blog/why-i-think-svelte-is-the-next-big-thing-a-reacts-lover-view.md b/src/content/blog/why-i-think-svelte-is-the-next-big-thing-a-reacts-lover-view.md index 13bd157..b33bcc9 100644 --- a/src/content/blog/why-i-think-svelte-is-the-next-big-thing-a-reacts-lover-view.md +++ b/src/content/blog/why-i-think-svelte-is-the-next-big-thing-a-reacts-lover-view.md @@ -17,18 +17,6 @@ So maybe one of you is thinking the same thing. Why [Svelte](https://svelte.dev/ This is not a tutorial, for that check the amazing [official tutorial](https://svelte.dev/tutorial/basics) which teaches you everything from the basics to more advanced stuff. -
- -![](images/alessandra-caretto-cAY9X4rPG3g-unsplash.jpg) - -
- -Photo by [Alessandra Caretto](https://unsplash.com/@alessandracaretto?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/wheel?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) - -
- -
- Why is there a wheel? Well quite simply: Svelte IS reinventing the wheel. You see: what are most of the frameworks doing nowadays? They provide a way to code user interfaces with a component approach. This makes a lot of sense because we can reuse them, they are boxed items that stand for them self. Modularizing and splitting up concerns make big apps easier to write and afterwards maintain.