cleanup

Commit 75d19fa5d9 (parent b5af0cae15)
Mirror of https://github.com/cupcakearmy/nicco.io.git, synced 2024-12-22 08:06:29 +00:00
@@ -25,29 +25,16 @@ The article will focus on Sapper, but the parts related to Directus are identica
 3. [Create a super small frontend](#3)
 4. [Write a custom hook for Directus that automatically triggers the build whenever content changes in the DB.](#4)

-<figure>
-![](images/noah-silliman-doBrZnp_wqA-unsplash.jpg)
-<figcaption>
-Photo by [Noah Silliman](https://unsplash.com/@noahsilliman?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/rabbit?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 ## Installing Directus

 This should be straightforward. These instructions are adapted from the [official docker guide](https://docs.directus.io/installation/docker.html). I will use Docker for this.

-```
+```yaml
 # docker-compose.yml

-version: "3.7"
+version: '3.7'

 services:
   mysql:
     image: mysql:5.7
     volumes:
@@ -57,7 +44,7 @@ services:
   directus:
     image: directus/directus:v8-apache
     ports:
-      - "8000:80"
+      - '8000:80'
     env_file: .env
     volumes:
       - ./data/config:/var/directus/config
@@ -66,7 +53,7 @@ services:

 Then we run `docker-compose up -d`. After a few seconds we need to initialise Directus.

-```
+```bash
 docker-compose run directus install --email some@email.com --password 1337
 ```
@@ -114,7 +101,7 @@ Give permissions to the public user

 I will not explain how [Sapper](https://sapper.svelte.dev/) works as this is not the focus today. If you don't know Sapper: it's very similar to Nuxt or Next.js, with the additional option to even export as static html, so the end result is similar to a Gatsby website. Very powerful and easy to use and code.

-```
+```bash
 # Setup
 npx degit "sveltejs/sapper-template#rollup" my-blog
 cd my-blog
@@ -127,52 +114,54 @@ yarn run dev

 Directus has a [JS SDK](https://docs.directus.io/guides/js-sdk.html) and since we have made the data public we don't even need a token or authentication. Awesome 🚀

-```
+```bash
 yarn add @directus/sdk-js
 ```

 First we are going to initialise the SDK. The default project name is simply `directus`

-```
+```ts
 // ./src/lib/api.js

 import DirectusSDK from '@directus/sdk-js'

 export const client = new DirectusSDK({
   url: 'http://localhost:8000',
-  project: 'directus'
+  project: 'directus',
 })
 ```

 Then let's make a server-side JSON loader so that the exported site will not even contact the server afterwards. Completely static HTML.

-```
+```ts
 // ./src/routes/posts.json.js

 import { client } from '../lib/api'

-export async function get (req, res, next) {
+export async function get(req, res, next) {
   try {
     const { data } = await client.getItems('posts')

     res.writeHead(200, {
-      'Content-Type': 'application/json'
+      'Content-Type': 'application/json',
     })
     res.end(JSON.stringify(data))
   } catch (e) {
     res.writeHead(404, {
-      'Content-Type': 'application/json'
+      'Content-Type': 'application/json',
     })
-    res.end(JSON.stringify({
-      message: 'Not found'
-    }))
+    res.end(
+      JSON.stringify({
+        message: 'Not found',
+      })
+    )
   }
 }
 ```

 Finally the Svelte component.

-```
+```svelte
 // ./src/routes/index.svelte

 <script context="module">
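The hunk cuts off before the component body. As a rough sketch only — following the same `this.fetch` preload pattern that appears in the search article further down, with the prop name being an assumption — the module script typically looks like this:

```js
// Hypothetical sketch: fetch the prebuilt posts.json at preload time so the
// exported page ships with its data and never calls Directus from the client.
export async function preload() {
  const res = await this.fetch('posts.json')
  const posts = await res.json()
  return { posts }
}
```

The component below it can then simply declare `export let posts` and render the list.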
@@ -205,7 +194,7 @@ I will illustrate the case for [Drone](https://drone.io/), but the approach can

 For that we create a new php file and give it a name. In my case: `drone-hook.php`

-```
+```php
 # ./hooks/drone-hook.php

 <?php
@@ -236,7 +225,7 @@ return [

 I've also put the token inside the `.env` file so that I can safely check my code into a repo without having to worry about a token lying around in the codebase.

-```
+```bash
 # .env

 ...
@@ -247,7 +236,7 @@ DRONE_TOKEN=my-drone-token

 The last thing to do is actually load the code into Directus. You can simply mount the `./hooks` folder we just created into the container and reload.

-```
+```yaml
 # docker-compose.yml

 version: "3.7"
@@ -16,18 +16,6 @@ For this article I will take my own [project](https://github.com/cupcakearmy/aut
 Also I will base this guide on [Drone](https://drone.io/). But I'm sure there is the same workflow for Jenkins/Circle/whatever CI/CD system you are using.
 This means I'm assuming you have a repository already running with Drone.

-<figure>
-![](images/franck-v-U3sOwViXhkY-unsplash-scaled-1.jpg)
-<figcaption>
-Photo by [Franck V.](https://unsplash.com/@franckinjapan?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/robot?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 The first thing we will need is an access token for the GitHub API.
 You can get one here: [https://github.com/settings/tokens](https://github.com/settings/tokens). I called mine `Drone`, and you need to check the permissions for the repos as follows.
@@ -69,39 +57,37 @@ Now it's time to edit our drone file and make everything automatic. The flow at

 Simple, right? Let's see how!

-```
+```yaml
 # .drone.yml
 ---
 kind: pipeline
 name: default

 steps:
   - name: build
     image: node
     pull: always
     commands:
       - yarn
       - yarn run bin
     when:
       event: tag

   - name: publish
     image: plugins/github-release
     pull: always
     settings:
       api_key:
         from_secret: github
       files: bin/*
       checksum:
         - sha512
       note: CHANGELOG.md
     when:
       event: tag
 ---
 kind: signature
 hmac: 3b1f235f6a6f0ee1aa3f572d0833c4f0eec931dbe0378f31b9efa336a7462912

-...
 ```

 Let's understand what is happening here:
@@ -124,13 +110,13 @@ The `checksum` setting is also amazing because as the name suggests the plugin a

 Simple! First tag your code with the following command

-```
+```bash
 git tag 1.2.3
 ```

 Now push the tag and drone will be on its way

-```
+```bash
 git push --tags
 ```
@@ -8,18 +8,6 @@ Today, I want to share how to create and host your own image transformation serv

 The building blocks will be [imgproxy](https://github.com/imgproxy/imgproxy) and [nginx](https://nginx.org/). The former is a battle-tested and fast image server with support for most image operations, while nginx should not need an introduction.

-<figure>
-![](images/meagan-carsience-QGnm_F_nd1E-unsplash1.jpg)
-<figcaption>
-Photo by [Meagan Carsience](https://unsplash.com/@mcarsience_photography?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/photos/QGnm_F_nd1E?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 While imgproxy is the core of this operation, it does not support caching. This is intentional, as it's intended to be run behind a proxy. For that, nginx is the tool of choice, as it enables us to easily set up caching rules to avoid generating the same image twice in a given cache interval. Everything will be done in docker containers, but the concept, of course, extends to bare metal too.

 ## Setup
@@ -30,7 +18,7 @@ It's generally advised to use signed URLs if possible. In my case, there was no

 Below is the docker file used. Only `IMGPROXY_BIND` is required, as otherwise nginx cannot connect to our image container. The other options are up to you and are just here for a quick setup.

-```
+```yaml
 # docker-compose.yaml
 version: '3.8'
@@ -65,7 +53,7 @@ services:

 The more interesting part is the nginx configuration file below. In this case, we target 30 days as a cache TTL. This could be easily increased if we are only talking about static images.

-```
+```nginx
 # Set cache to 30 days, 1GB.
 # Only use the uri as the cache key, as it's the only input for imageproxy.
 proxy_cache_path /tmp levels=1:2 keys_zone=images:8m max_size=1g inactive=30d;
@@ -1,82 +0,0 @@
----
-title: "Create a QR code for Google Drive"
-date: "2022-03-17"
----
-
-So you want to make a QR code to a google drive file? It's actually quite easy, I'll show you!
-
-## 1\. Upload the file and get the shared link
-
-As shown in the video below the first thing is to upload your file (in this case a PDF) and create a sharable link.
-
-<figure>
-<figcaption>
-Uploading and generating a link for a google drive file
-</figcaption>
-</figure>
-
-## 2\. Convert the link to a download link
-
-```
-https://drive.google.com/file/d/1LZ09_aJnGy1aHY0DEuOEFGU4mon2ijir/view?usp=sharing
-```
-
-If we simply use the provided link (example above) it won't download the file, but create a preview of it.
-
-If we want a direct download we need to change it to that below:
-
-```
-https://drive.google.com/uc?export=download&id=1LZ09_aJnGy1aHY0DEuOEFGU4mon2ijir
-```
-
-To summarise:
-
-```
-https://drive.google.com/file/d/<id>/view?usp=sharing
-⬇️
-https://drive.google.com/uc?export=download&id=<id>
-```
-
-Note that the _`<id>`_ part will be different for your file. The rest is the same.
-
-## 3\. Create the QR Code
-
-To create a QR code there is a very good free website called: [the-qrcode-generator.com](https://www.the-qrcode-generator.com/). Here you simply paste the link and get your QR Code.
-
-<figure>
-![](https://api.nicco.io/wp-content/uploads/2022/03/QR-Big.svg)
-<figcaption>
-Big QR code
-</figcaption>
-</figure>
-
-## 4\. Make the QR code smaller and track clicks
-
-If you want to have a smaller and cleaner QR code you can use a URL shortener like [Cuttly](https://cutt.ly/) to do so. With Cuttly the URL gets shorter and you can see how many people clicked on it. The new link and QR code then look something like this:
-
-```
-https://cutt.ly/CSonJs9
-```
-
-<figure>
-![](https://api.nicco.io/wp-content/uploads/2022/03/QR-Small.svg)
-<figcaption>
-Small QR code
-</figcaption>
-</figure>
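The share-link → download-link rewrite described in the removed article above is purely mechanical. As a hedged sketch (not part of the original post), a small helper doing the same `<id>` extraction could look like this:

```js
// Hypothetical helper: turn a Google Drive share link into a direct-download link.
function toDirectDownload(shareUrl) {
  const match = shareUrl.match(/\/file\/d\/([^/]+)\//)
  if (!match) throw new Error('not a Drive share link')
  return `https://drive.google.com/uc?export=download&id=${match[1]}`
}

// Example from the post:
toDirectDownload('https://drive.google.com/file/d/1LZ09_aJnGy1aHY0DEuOEFGU4mon2ijir/view?usp=sharing')
// -> 'https://drive.google.com/uc?export=download&id=1LZ09_aJnGy1aHY0DEuOEFGU4mon2ijir'
```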
@@ -25,18 +25,6 @@ Today we will look on how to train a simple mnist digit recogniser and then expo

 Also I am not going to explain what machine learning is, as there are enough guides, videos, podcasts, ... that already do a much better job than I could and would be outside the scope of this article.

-<figure>
-![](images/natasha-connell-byp5TTxUbL0-unsplash-scaled-1.jpg)
-<figcaption>
-Photo by [Natasha Connell](https://unsplash.com/@natcon773?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/brain?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 So the first thing we need to understand is that we will not train the model in the browser. That is a job for GPUs and the goal here is only to use a pre-trained model inside of the browser. Training is a much more resource-intensive task than simply using the net.

 ## Training the model

@@ -46,7 +34,7 @@ So, the first step is to actually have a model. I will do this in tensorflow 2.0
 The code below is basically an adapted version of the [keras hello world example](https://keras.io/examples/mnist_cnn/).
 If you want to run the code yourself (which you should!) simply head over to [Google Colab](https://colab.research.google.com), create a new file and just paste the code. There you can run it for free on GPUs which is pretty dope!

-```
+```py
 from tensorflow.keras.datasets import mnist
 from tensorflow.keras.models import Sequential
 from tensorflow.keras.layers import Dense, Dropout, Flatten

@@ -106,7 +94,7 @@ Unfortunately this is not compatible with tensorflow-js. So we need another way.

 There is a package called tensorflowjs for python (confusing, right? 😅) that provides the functionality we need

-```
+```ts
 import tensorflowjs as tfjs

 tfjs.converters.save_keras_model(model, './js')

@@ -119,19 +107,19 @@ Inside there you will find a `model.json` that basically describes the structure

 Now we are ready to import that. First we need to install the `@tensorflow/tfjs` package.

-```
+```ts
-import * as tf from '@tensorflow/tfjs';
+import * as tf from '@tensorflow/tfjs'

 let model

-tf.loadLayersModel('/model.json').then(m => {
+tf.loadLayersModel('/model.json').then((m) => {
   model = m
 })
 ```

 Ok how do I use that now?

-```
+```ts
 const tensor = tf.tensor(new Uint8Array(ourData), [1, 28, 28, 1])
 const prediction = model.predict(tensor)
 ```

@@ -146,67 +134,65 @@ I'm not gonna talk about what bundler, etc. I'm using. If you interested simply

 First let's write some basic html for the skeleton of our page.

-```
+```html
 <html>
-
 <head>
 <style>
 * {
 box-sizing: border-box;
 font-family: monospace;
 }

 html,
 body {
 padding: 0;
 margin: 0;
 height: 100vh;
 width: 100vw;
 display: flex;
 justify-content: center;
 align-items: center;
 }

-body>div {
+body > div {
 text-align: center;
 }

 div canvas {
 display: inline-block;
 border: 1px solid;
 }

 div input {
 display: inline-block;
-margin-top: .5em;
-padding: .5em 2em;
+margin-top: 0.5em;
+padding: 0.5em 2em;
 background: white;
 outline: none;
 border: 1px solid;
 font-weight: bold;
 }
 </style>
 </head>

 <body>
 <div>
 <h1>MNIST (Pretrained)</h1>
 <canvas id="can" width="28" height="28"></canvas>
 <br />
-<input id="clear" type="button" value="clear">
+<input id="clear" type="button" value="clear" />
 <br />
-<input id="test" type="button" value="test">
+<input id="test" type="button" value="test" />
 <br />
 <h2 id="result"></h2>
 <a href="https://github.com/cupcakearmy/mnist">
 <h3>source code</h3>
 </a>
 </div>

 <script src="./tf.js"></script>
 <script src="./canvas.js"></script>
 </body>

 </html>
 ```

@@ -215,85 +201,102 @@ The code is adapted from [this stackoverflow answer](https://stackoverflow.com/a

 In essence it's a canvas that listens on our mouse events and fills the pixels with black. Nothing more.

-```
+```ts
 /* jslint esversion: 6, asi: true */

-var canvas, ctx, flag = false,
-  prevX = 0,
-  currX = 0,
-  prevY = 0,
-  currY = 0,
-  dot_flag = false;
+var canvas,
+  ctx,
+  flag = false,
+  prevX = 0,
+  currX = 0,
+  prevY = 0,
+  currY = 0,
+  dot_flag = false

-var x = "black",
-  y = 2;
+var x = 'black',
+  y = 2

 function init() {
-  canvas = document.getElementById('can');
-  ctx = canvas.getContext("2d");
-  w = canvas.width;
-  h = canvas.height;
+  canvas = document.getElementById('can')
+  ctx = canvas.getContext('2d')
+  w = canvas.width
+  h = canvas.height

-  canvas.addEventListener("mousemove", function (e) {
-    findxy('move', e)
-  }, false);
-  canvas.addEventListener("mousedown", function (e) {
-    findxy('down', e)
-  }, false);
-  canvas.addEventListener("mouseup", function (e) {
-    findxy('up', e)
-  }, false);
-  canvas.addEventListener("mouseout", function (e) {
-    findxy('out', e)
-  }, false);
+  canvas.addEventListener(
+    'mousemove',
+    function (e) {
+      findxy('move', e)
+    },
+    false
+  )
+  canvas.addEventListener(
+    'mousedown',
+    function (e) {
+      findxy('down', e)
+    },
+    false
+  )
+  canvas.addEventListener(
+    'mouseup',
+    function (e) {
+      findxy('up', e)
+    },
+    false
+  )
+  canvas.addEventListener(
+    'mouseout',
+    function (e) {
+      findxy('out', e)
+    },
+    false
+  )

   window.document.getElementById('clear').addEventListener('click', erase)
 }

 function draw() {
-  ctx.beginPath();
-  ctx.moveTo(prevX, prevY);
-  ctx.lineTo(currX, currY);
-  ctx.strokeStyle = x;
-  ctx.lineWidth = y;
-  ctx.stroke();
-  ctx.closePath();
+  ctx.beginPath()
+  ctx.moveTo(prevX, prevY)
+  ctx.lineTo(currX, currY)
+  ctx.strokeStyle = x
+  ctx.lineWidth = y
+  ctx.stroke()
+  ctx.closePath()
 }

 function erase() {
-  ctx.clearRect(0, 0, w, h);
+  ctx.clearRect(0, 0, w, h)
 }

 function findxy(res, e) {
   if (res == 'down') {
-    prevX = currX;
-    prevY = currY;
-    currX = e.clientX - canvas.offsetLeft;
-    currY = e.clientY - canvas.offsetTop;
-
-    flag = true;
-    dot_flag = true;
+    prevX = currX
+    prevY = currY
+    currX = e.clientX - canvas.offsetLeft
+    currY = e.clientY - canvas.offsetTop
+
+    flag = true
+    dot_flag = true
     if (dot_flag) {
-      ctx.beginPath();
-      ctx.fillStyle = x;
-      ctx.fillRect(currX, currY, 2, 2);
-      ctx.closePath();
-      dot_flag = false;
+      ctx.beginPath()
+      ctx.fillStyle = x
+      ctx.fillRect(currX, currY, 2, 2)
+      ctx.closePath()
+      dot_flag = false
     }
   }
-  if (res == 'up' || res == "out") {
-    flag = false;
+  if (res == 'up' || res == 'out') {
+    flag = false
   }
   if (res == 'move') {
     if (flag) {
-      prevX = currX;
-      prevY = currY;
-      currX = e.clientX - canvas.offsetLeft;
-      currY = e.clientY - canvas.offsetTop;
-      draw();
+      prevX = currX
+      prevY = currY
+      currX = e.clientX - canvas.offsetLeft
+      currY = e.clientY - canvas.offsetTop
+      draw()
     }
   }
 }

 init()
 ```

@@ -301,26 +304,26 @@ init()

 And now the glue to put this together is the piece of code that listens on the "test" button.

-```
+```ts
-import * as tf from '@tensorflow/tfjs';
+import * as tf from '@tensorflow/tfjs'

 let model

-tf.loadLayersModel('/model.json').then(m => {
+tf.loadLayersModel('/model.json').then((m) => {
   model = m
 })

 window.document.getElementById('test').addEventListener('click', async () => {
   const canvas = window.document.querySelector('canvas')

   const { data, width, height } = canvas.getContext('2d').getImageData(0, 0, 28, 28)

   const tensor = tf.tensor(new Uint8Array(data.filter((_, i) => i % 4 === 3)), [1, 28, 28, 1])
   const prediction = model.predict(tensor)
   const result = await prediction.data()
   const guessed = result.indexOf(1)
   console.log(guessed)
   window.document.querySelector('#result').innerText = guessed
 })
 ```

@@ -329,7 +332,7 @@ Here we need to explain a few things.

 Then, instead of simply passing the data to the tensor, we need to do some magic with `data.filter` in order to get only every 4th value. This is because our canvas has 3 channels + 1 alpha, but we only need to know if the pixel is black or not. We do this by simply filtering for the index mod 4

-```
+```ts
 data.filter((_, i) => i % 4 === 3)
 ```
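To make the mod-4 trick concrete, here is a tiny, self-contained illustration (not from the original post) of which values survive the filter:

```js
// ImageData.data is a flat RGBA array: [r0, g0, b0, a0, r1, g1, b1, a1, ...]
// Keeping only indices where i % 4 === 3 keeps the alpha value of each pixel,
// i.e. exactly one number per pixel — enough to tell "drawn" from "empty".
const data = new Uint8ClampedArray([0, 0, 0, 255, 0, 0, 0, 0]) // one drawn pixel, one empty pixel
console.log(data.filter((_, i) => i % 4 === 3)) // -> Uint8ClampedArray [ 255, 0 ]
```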
@@ -1,6 +1,7 @@
 ---
 title: 'How to search in the JAM'
 date: '2020-12-06'
+coverImage: './images/uriel-soberanes-gCeH4z9m7bg-unsplash.jpg'
 categories:
   - 'coding'
 tags:

@@ -21,18 +22,6 @@ We will look at the following:
 2. Search Accuracy & Precision
 3. Performance & Size

-<figure>
-![Telescope](images/uriel-soberanes-gCeH4z9m7bg-unsplash.jpg)
-<figcaption>
-Photo by [Uriel Soberanes](https://unsplash.com/@soberanes?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/telescope?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 We can't rely on a backend as discussed above, so the magic will happen at build time, like everything in the JAM-verse.

 I've decided to go with the free and open source [lunr.js](https://lunrjs.com/) which is a simple but still quite powerful search engine that can run in the client.

@@ -60,7 +49,7 @@ So I'm using [Sapper](https://sapper.svelte.dev/) for this blog so the examples

 First we need to aggregate all our data. In my case this means all the single pages, blog entries, projects and works. So I created a `/src/routes/search.json.js` file and got to work.

-```
+```ts
 import lunr from 'lunr'

 import { getAll } from '../lib/wp'
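The hunk stops right after the imports. For orientation, a minimal sketch of what such a build-time index endpoint usually does — the shape returned by `getAll` is an assumption, and only lunr's documented builder API is used:

```js
// Sketch: build the index once at build time and serialise it, so the client
// can later restore it with lunr.Index.load(prebuilt).
export async function get(req, res) {
  const documents = await getAll() // assumption: [{ slug, title, content }, ...]

  const idx = lunr(function () {
    this.ref('slug')
    this.field('title')
    this.field('content')
    for (const doc of documents) this.add(doc)
  })

  res.writeHead(200, { 'Content-Type': 'application/json' })
  res.end(JSON.stringify(idx)) // a lunr index serialises cleanly to JSON
}
```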
@@ -109,7 +98,7 @@ Now I have the "search model" ready. You can have a look: [nicco.io/search.json]

 It's time to integrate the search into the actual website 🚀

-```
+```html
 <script context="module">
   export async function preload() {
     const prebuilt = await this.fetch(`/search.json`).then((res) => res.json())

@@ -141,29 +130,29 @@ It's time to integrate the search into the actual website 🚀
   $: search(needle)
 </script>

-<input bind:value={needle} placeholder="needle" />
+<input bind:value="{needle}" placeholder="needle" />
 <ul>
   {#each results as result (result.ref)}
     <SearchResult {result} />
   {/each}
 </ul>
 ```

 The first thing we do is load our prebuilt `/search.json` and load it into an instance of `lunr`. This only needs to happen once; once the index is loaded we are ready to go.

-```
+```ts
 const idx = lunr.Index.load(prebuilt)
 ```

 For the searching itself `lunr` has quite a [few options](https://lunrjs.com/guides/searching.html). The most relevant for me were the wildcard and the fuzzy search. While wildcard is good for when we haven't completed a word yet, fuzzy helps us with typos.

-```
+```ts
 const fuzzy = idx.search(needle + '~1') // foo~1
 ```

 While not explicitly said in the docs I'm guessing they use the [Levenshtein Distance](https://en.wikipedia.org/wiki/Levenshtein_distance), which means `~1` will replace at most 1 char.

-```
+```ts
 const wildcard = idx.search(needle + '*') // fo*
 ```
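Putting the two query styles together, a sketch of the `search` function referenced by `$: search(needle)` above, de-duplicating hits by `ref`; the exact implementation on the site may differ:

```js
// Sketch: run both query styles and merge the hits, keeping each ref once.
function search(needle) {
  if (!needle) return (results = [])
  const hits = [...idx.search(needle + '~1'), ...idx.search(needle + '*')]
  const seen = new Set()
  results = hits.filter((hit) => !seen.has(hit.ref) && seen.add(hit.ref))
}
```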
@@ -20,18 +20,6 @@ There are numerous of plug-ins that can accomplish anything from contacts syncin

 Trying to be everything at the same time comes at a cost, and that is an experience that, at least in my experience, never feels polished or finished. While the Nextcloud plug-ins are incredibly versatile and powerful, they also leave room for segmentation and you will notice it.

-<figure>
-![](images/pawel-nolbert-xe-ss5Tg2mo-unsplash.jpg)
-<figcaption>
-Cloud and Ocean
-</figcaption>
-</figure>

 ### The permanent alpha

 That's what using Nextcloud feels like 75% of the time. I have no insight into the company behind the project, but it feels like they are chasing a release cycle for the sake of paper-launching unfinished features that compromise in terms of stability and polish. The thing that bothers me the most is that they are constantly marketed as "production ready" when they clearly had not nearly enough QA.

@@ -78,7 +66,7 @@ Seafile on the other hand just had the release of it's 8th version (still in bet
 I had to migrate 2 things: Cal/CardDav for Calendar and Contacts and the files drive itself.
 Spinning up a Seafile instance was a breeze as I host every single service with docker.

-```
+```bash
 # .env
 MYSQL_ROOT_PASSWORD=random
 DB_HOST=db

@@ -88,8 +76,8 @@ SEAFILE_ADMIN_EMAIL=me@example.com
 SEAFILE_ADMIN_PASSWORD=a_very_secret_password
 ```

-```
+```yaml
-version: "2.0"
+version: '2.0'

 services:
   db:

@@ -129,13 +117,13 @@ Since Seafile focuses only on the "Drive" component I had to migrate my contacts

 You can find my [Radicale docker image here](https://github.com/cupcakearmy/docker-radicale), maybe you find it useful. It supports bcrypt passwords and can be deployed with just the env variables for `USER` and `PASSWORD`. It has been tested with the iOS and macOS native clients.

-```
+```bash
 # .env
 USER=foo
 PASSWORD=secret
 ```

-```
+```yaml
 # docker-compose.yml
 version: '3.7'
@@ -27,19 +27,24 @@ Then I quickly copied the JS tracker code in my main html template and thought t
 So it turns out that Matomo, being widely used, is of course included in many ad-blocker lists and therefore my stats did not work. Let's see why:
 Basically all ad blockers work with lists. Those lists include patterns that, if matched, will be filtered out. Let's take a look at the default Matomo tracking code:

-```
+```html
 <script type="text/javascript">
-  var _paq = window._paq = window._paq || [];
+  var _paq = (window._paq = window._paq || [])
   /* tracker methods like "setCustomDimension" should be called before "trackPageView" */
-  _paq.push(['trackPageView']);
-  _paq.push(['enableLinkTracking']);
-  (function() {
-    var u="//stats.nicco.io/";
-    _paq.push(['setTrackerUrl', u+'matomo.php']);
-    _paq.push(['setSiteId', '1']);
-    var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0];
-    g.type='text/javascript'; g.async=true; g.src=u+'matomo.js'; s.parentNode.insertBefore(g,s);
-  })();
+  _paq.push(['trackPageView'])
+  _paq.push(['enableLinkTracking'])
+  ;(function () {
+    var u = '//stats.nicco.io/'
+    _paq.push(['setTrackerUrl', u + 'matomo.php'])
+    _paq.push(['setSiteId', '1'])
+    var d = document,
+      g = d.createElement('script'),
+      s = d.getElementsByTagName('script')[0]
+    g.type = 'text/javascript'
+    g.async = true
+    g.src = u + 'matomo.js'
+    s.parentNode.insertBefore(g, s)
+  })()
 </script>
 ```

@@ -79,7 +84,7 @@ Luckily Apache has the famous Rewrite module, which will solve all our problems.

 We can create a `.htaccess` file in the root of our Matomo installation folder, to cloak our requests.

-```
+```apache
 # .htaccess
 RewriteEngine On
 RewriteRule ^unicorn matomo.js

@@ -88,11 +93,11 @@ RewriteRule ^rainbow matomo.php

 Now if we request `https://stats.nicco.io/unicorn` we actually get the response for `https://stats.nicco.io/matomo.js`, and the same for `rainbow` and `matomo.php`.

-```
+```js
 // Replace in the client

-_paq.push(['setTrackerUrl', u+'matomo.php']); // Before
+_paq.push(['setTrackerUrl', u + 'matomo.php']) // Before
-_paq.push(['setTrackerUrl', u+'rainbow']); // After
+_paq.push(['setTrackerUrl', u + 'rainbow']) // After

 g.src = u + 'matomo.js' // Before
 g.src = u + 'unicorn' // After

@@ -102,7 +107,7 @@ g.src = u + 'unicorn' // After

 I had to create a minuscule `Dockerfile` as the `Rewrite` module is not enabled per default in the standard Matomo docker image.

-```
+```Dockerfile
 # Dockerfile
 FROM matomo
 RUN a2enmod rewrite

@@ -118,9 +123,9 @@ Now as you can see it's incredibly easy to mask tracking stuff, and I bet there

 The `Dockerfile` and the `.htaccess` files are shown above.

-```
+```yaml
 # docker-compose.yml
-version: "3.7"
+version: '3.7'

 networks:
   traefik:

@@ -149,13 +154,13 @@ services:
       - traefik.docker.network=traefik
       - traefik.port=80
       - traefik.backend=matomo
-      - "traefik.frontend.rule=Host:stats.nicco.io;"
+      - 'traefik.frontend.rule=Host:stats.nicco.io;'
     networks:
       - traefik
       - default
 ```

-```
+```bash
 # .env
 MYSQL_DATABASE=matomo
 MYSQL_USER=matomo
@@ -25,7 +25,7 @@ For monitoring we will use [Uptime Kuma](https://github.com/louislam/uptime-kuma

 First we need to [install docker](https://docs.docker.com/engine/install/debian/#install-using-the-repository)

-```
+```bash
 curl -fsSL https://download.docker.com/linux/debian/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

 echo \

@@ -38,7 +38,7 @@ apt install docker-ce docker-ce-cli containerd.io docker-compose-plugin

 Also we want a basic firewall

-```
+```bash
 apt install ufw
 ufw allow 80
 ufw allow 443

@@ -65,7 +65,7 @@ We only need a `docker-compose.yaml` file now and we should be up and running. I

 Let's start with Traefik. It will handle all our routing and TLS certificates. Remember to change the acme email down in the `traefik.yaml`

-```
+```yaml
 version: '3.8'

 networks:

@@ -78,36 +78,36 @@ services:
     image: traefik:2.8
     restart: unless-stopped
     ports:
-      - "80:80"
-      - "443:443"
+      - '80:80'
+      - '443:443'
     volumes:
       - /var/run/docker.sock:/var/run/docker.sock
       - ./traefik.yaml:/etc/traefik/traefik.yaml:ro
       - ./data:/data
     labels:
-      - "traefik.enable=true"
+      - 'traefik.enable=true'

       # HTTP to HTTPS redirection
-      - "traefik.http.routers.http_catchall.rule=HostRegexp(`{any:.+}`)"
-      - "traefik.http.routers.http_catchall.entrypoints=insecure"
-      - "traefik.http.routers.http_catchall.middlewares=https_redirect"
-      - "traefik.http.middlewares.https_redirect.redirectscheme.scheme=https"
-      - "traefik.http.middlewares.https_redirect.redirectscheme.permanent=true"
+      - 'traefik.http.routers.http_catchall.rule=HostRegexp(`{any:.+}`)'
+      - 'traefik.http.routers.http_catchall.entrypoints=insecure'
+      - 'traefik.http.routers.http_catchall.middlewares=https_redirect'
+      - 'traefik.http.middlewares.https_redirect.redirectscheme.scheme=https'
+      - 'traefik.http.middlewares.https_redirect.redirectscheme.permanent=true'
 ```

-```
+```yaml
-#Define HTTP and HTTPS entrypoints
+# Define HTTP and HTTPS entrypoints
 entryPoints:
   insecure:
-    address: ":80"
+    address: ':80'
   secure:
-    address: ":443"
+    address: ':443'

 #Dynamic configuration will come from docker labels
 providers:
   docker:
-    endpoint: "unix:///var/run/docker.sock"
-    network: "proxy"
+    endpoint: 'unix:///var/run/docker.sock'
+    network: 'proxy'
     exposedByDefault: false

 #Enable acme with http file challenge

@@ -123,7 +123,7 @@ certificatesResolvers:

 To get traefik running we just need to type the following

-```
+```bash
 docker network create proxy
 docker compose up -d
 ```

@@ -132,7 +132,7 @@ docker compose up -d

 The compose file for Kuma is compact. Don't forget to change the domain to yours.

-```
+```yaml
 version: '3.8'

 networks:

@@ -147,10 +147,10 @@ services:
     volumes:
       - ./data:/app/data
     labels:
       - traefik.enable=true
       - traefik.http.routers.kuma.rule=Host(`status.example.org`)
       - traefik.http.routers.kuma.entrypoints=secure
       - traefik.http.routers.kuma.tls.certresolver=le
 ```

 Now you can navigate to your new monitoring website, create an admin account and set up monitors, alert systems and so on.
@@ -22,18 +22,6 @@ Overview
 4. [lists / arrays](#lists)
 5. [complex data types handling](#complex)

-<figure>
-![](images/jonathan-chng-HgoKvtKpyHA-unsplash-scaled-1.jpg)
-<figcaption>
-Photo by [Jonathan Chng](https://unsplash.com/@jon_chng?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/run?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
-</figcaption>
-</figure>

 Let's assume we want to run the following python code in rust.

 ```

@@ -54,19 +42,19 @@ Lets see the steps we need to take to achieve this:

 First let's create a new rust project by running:

-```
+```bash
 cargo new rust_in_python
 ```

 Then let's rename `src/main.rs` to `src/lib.rs` as we want a library and not a standalone program.

-```
+```bash
 mv src/main.rs src/lib.rs
 ```

 Now we simply write a hello world function in rust

-```
+```rust
 #[no_mangle]
 fn hello() {
     println!("Hello from rust 👋");

@@ -77,20 +65,20 @@ For every function that need to be available to other languages (in our case Pyt

 The last step is to tell rust to compile to a dynamic library. To do so simply add the following to your `Cargo.toml` config file.

-```
+```toml
 [lib]
 crate-type = ["dylib"]
 ```

 Now we are ready to build 🚀

-```
+```bash
 cargo build --release
 ```

 Now just create a `main.py` file and we can import and run our function.

-```
+```py
 from ctypes import CDLL

 lib = CDLL("target/release/librust_in_python.dylib")

@@ -99,7 +87,7 @@ lib.hello()

 And if you run it you will be greeted from rust. No need to install anything, the `ctypes` package is included in the standard python library.

-```
+```bash
 python main.py
 ```

@@ -111,7 +99,7 @@ Before we start I would like to remind you that python is untyped whereas rust o

 First let's write the simple add function in rust

-```
+```rust
 #[no_mangle]
 fn add(a: f64, b: f64) -> f64 {
     return a + b;

@@ -120,13 +108,13 @@ fn add(a: f64, b: f64) -> f64 {

 Don't forget to build again 😉

-```
+```bash
 cargo build --release
 ```

 Now to the python part

-```
+```py
 from ctypes import CDLL, c_double

 lib = CDLL("target/release/librust_in_python.dylib")

@@ -158,7 +146,7 @@ So what about lists? Unfortunately I have not found a way to use Vectors for dyn

 ###### Rust

-```
+```rust
 #[no_mangle]
 fn sum(arr: [i32; 5]) -> i32 {
     let mut total: i32 = 0;

@@ -171,7 +159,7 @@ fn sum(arr: [i32; 5]) -> i32 {

 ###### Python

-```
+```python
 from ctypes import CDLL, c_int

 lib = CDLL("target/release/librust_in_python.dylib")

@@ -191,7 +179,7 @@ Often it can be very useful to send and/or receive data in a structured, compact

 ###### Rust

-```
+```rust
 #[repr(C)]
 pub struct Point {
     pub x: f64,

@@ -206,7 +194,7 @@ fn greet_point(p: Point) {

 ###### Python

-```
+```python
 from ctypes import CDLL, Structure, c_double

 lib = CDLL("target/release/librust_in_python.dylib")
@ -12,18 +12,6 @@ So you ever wondered why your docker build takes so long to startup when all you
|
|||||||
|
|
||||||
Fear no more! `.dockerignore` to the rescue ⚓️.
|
Fear no more! `.dockerignore` to the rescue ⚓️.
|
||||||
|
|
||||||
<figure>
|
|
||||||
|
|
||||||
![](images/thomas-kelley-t20pc32VbrU-unsplash-scaled-1.jpg)
|
|
||||||
|
|
||||||
<figcaption>
|
|
||||||
|
|
||||||
Photo by [Thomas Kelley](https://unsplash.com/@thkelley?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/whale?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
|
|
||||||
|
|
||||||
</figcaption>
|
|
||||||
|
|
||||||
</figure>
|
|
||||||
|
|
||||||
Whenever you build a docker image the first thing you will always see is the following:
|
Whenever you build a docker image the first thing you will always see is the following:
|
||||||
|
|
||||||
```bash
|
```bash
|
||||||
|
@ -18,18 +18,6 @@ We will look at a few ways how to detect and handle dark modes in 2020:
|
|||||||
2. [JS](#js)
|
2. [JS](#js)
|
||||||
3. [React](#react)
|
3. [React](#react)
|
||||||
|
|
||||||
<figure>
|
|
||||||
|
|
||||||
![](images/davisco-5E5N49RWtbA-unsplash-scaled-1.jpg)
|
|
||||||
|
|
||||||
<figcaption>
|
|
||||||
|
|
||||||
Photo by [davisco](https://unsplash.com/@codytdavis?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/contrast?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
|
|
||||||
|
|
||||||
</figcaption>
|
|
||||||
|
|
||||||
</figure>
|
|
||||||
|
|
||||||
## Pure CSS
|
## Pure CSS
|
||||||
|
|
||||||
First lets have a look how we can do this using only CSS. There is a new css media query that is supported by [almost any browser](https://caniuse.com/#feat=prefers-color-scheme) right now.
|
First lets have a look how we can do this using only CSS. There is a new css media query that is supported by [almost any browser](https://caniuse.com/#feat=prefers-color-scheme) right now.
|
||||||
|
@ -3,6 +3,8 @@ title: 'Why I love JS but sometimes I feel we shoot ourself in the foot.'
|
|||||||
date: '2020-05-29'
|
date: '2020-05-29'
|
||||||
categories:
|
categories:
|
||||||
- 'general'
|
- 'general'
|
||||||
|
tags:
|
||||||
|
- rant
|
||||||
---
|
---
|
||||||
|
|
||||||
Let's start by saying this: I absolutely love JS & Typescript, they are my favourite languages and I would not want to live without them.
|
Let's start by saying this: I absolutely love JS & Typescript, they are my favourite languages and I would not want to live without them.
|
||||||
|
@ -17,18 +17,6 @@ So maybe one of you is thinking the same thing. Why [Svelte](https://svelte.dev/
|
|||||||
|
|
||||||
This is not a tutorial, for that check the amazing [official tutorial](https://svelte.dev/tutorial/basics) which teaches you everything from the basics to more advanced stuff.
|
This is not a tutorial, for that check the amazing [official tutorial](https://svelte.dev/tutorial/basics) which teaches you everything from the basics to more advanced stuff.
|
||||||
|
|
||||||
<figure>
|
|
||||||
|
|
||||||
![](images/alessandra-caretto-cAY9X4rPG3g-unsplash.jpg)
|
|
||||||
|
|
||||||
<figcaption>
|
|
||||||
|
|
||||||
Photo by [Alessandra Caretto](https://unsplash.com/@alessandracaretto?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText) on [Unsplash](https://unsplash.com/s/photos/wheel?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText)
|
|
||||||
|
|
||||||
</figcaption>
|
|
||||||
|
|
||||||
</figure>
|
|
||||||
|
|
||||||
Why is there a wheel? Well quite simply: Svelte IS reinventing the wheel.
|
Why is there a wheel? Well quite simply: Svelte IS reinventing the wheel.
|
||||||
You see: what are most of the frameworks doing nowadays? They provide a way to code user interfaces with a component approach. This makes a lot of sense because we can reuse them, they are boxed items that stand for them self. Modularizing and splitting up concerns make big apps easier to write and afterwards maintain.
|
You see: what are most of the frameworks doing nowadays? They provide a way to code user interfaces with a component approach. This makes a lot of sense because we can reuse them, they are boxed items that stand for them self. Modularizing and splitting up concerns make big apps easier to write and afterwards maintain.
|
||||||
|
|
||||||
|