Mirror of https://github.com/funkypenguin/geek-cookbook/ (synced 2025-12-13 09:46:23 +00:00)
Fix some unnecessary 301s
Signed-off-by: David Young <davidy@funkypenguin.co.nz>
@@ -1,14 +1,20 @@
-
 [archivebox]: /recipes/archivebox/
 [authelia]: /docker-swarm/authelia/
 [autopirate]: /recipes/autopirate/
 [bazarr]: /recipes/autopirate/bazarr/
 [calibre-web]: /recipes/calibre-web/
+[cert_aws]: https://www.credly.com/badges/a0c4a196-55ab-4472-b46b-b610b44dc00f
+[cert_cka]: https://www.credly.com/badges/cd307d51-544b-4bc6-97b0-9015e40df40d
+[cert_ckad]: https://www.credly.com/badges/9ed9280a-fb92-46ca-b307-8f74a2cccf1d
+[cert_cks]: https://www.credly.com/badges/93fa53da-1f38-47a9-b6ee-dce6a8fad9fc
+[contact]: https://www.funkypenguin.co.nz/contact
 [cyberchef]: /recipes/cyberchef/
+[duplicity]: /recipes/autopirate/duplicity/
 [emby]: /recipes/emby/
 [funkwhale]: /recipes/autopirate/funkwhale/
 [github_sponsor]: https://github.com/sponsors/funkypenguin
 [headphones]: /recipes/autopirate/headphones/
+[homeassistant]: /recipes/homeassistant/
 [jackett]: /recipes/autopirate/jackett/
 [jellyfin]: /recipes/jellyfin/
 [keycloak]: /recipes/keycloak/
@@ -41,10 +47,4 @@
 [tfa-dex-static]: /docker-swarm/traefik-forward-auth/dex-static/
 [tfa-google]: /docker-swarm/traefik-forward-auth/google/
 [tfa-keycloak]: /docker-swarm/traefik-forward-auth/keycloak/
 [tfa]: /docker-swarm/traefik-forward-auth/
-
-[cert_aws]: https://www.credly.com/badges/a0c4a196-55ab-4472-b46b-b610b44dc00f
-[cert_cka]: https://www.credly.com/badges/cd307d51-544b-4bc6-97b0-9015e40df40d
-[cert_ckad]: https://www.credly.com/badges/9ed9280a-fb92-46ca-b307-8f74a2cccf1d
-[cert_cks]: https://www.credly.com/badges/93fa53da-1f38-47a9-b6ee-dce6a8fad9fc
-[contact]: https://www.funkypenguin.co.nz/contact
@@ -4,7 +4,7 @@
 Already deployed:

 * [X] [Docker swarm cluster](/docker-swarm/design/) with [persistent shared storage](/docker-swarm/shared-storage-ceph/)
-* [X] [Traefik](/docker-swarm/traefik) configured per design
+* [X] [Traefik](/docker-swarm/traefik/) configured per design
 * [X] DNS entry for the hostname you intend to use (*or a wildcard*), pointed to your [keepalived](/docker-swarm/keepalived/) IP

 Related:
@@ -50,7 +50,7 @@ Running such a platform enables you to run selfhosted services such as the [Auto
 * [Automated backup](/recipes/elkarbackup/) of configuration and data
 * [Monitoring and metrics](/recipes/swarmprom/) collection, graphing and alerting

-Recent updates and additions are posted on the [CHANGELOG](/CHANGELOG/), and there's a friendly community of like-minded geeks in the [Discord server](http://chat.funkypenguin.co.nz).
+Recent updates and additions are posted on the [CHANGELOG](/changelog/), and there's a friendly community of like-minded geeks in the [Discord server](http://chat.funkypenguin.co.nz).

 ## How will this benefit me?

@@ -16,7 +16,7 @@ This recipe presents a method to combine these tools into a single swarm deploym

 Tools included in the AutoPirate stack are:

-* [SABnzbd][sabnzbd] is the workhorse. It takes `.nzb` files as input (_manually or from [Sonarr](/recipes/autopirate/sonarr/), [Radarr](/recipes/autopirate/radarr/), etc_), then connects to your chosen Usenet provider, downloads all the individual binaries referenced by the .nzb, and then tests/repairs/combines/uncompresses them all into the final result - media files, to be consumed by [Plex](/recipes/plex), [Emby](/recipes/emby/), [Komga](/recipes/komga/), [Calibre-Web](/recipes/calibre-web/), etc.
+* [SABnzbd][sabnzbd] is the workhorse. It takes `.nzb` files as input (_manually or from [Sonarr][sonarr], [Radarr][radarr], etc_), then connects to your chosen Usenet provider, downloads all the individual binaries referenced by the .nzb, and then tests/repairs/combines/uncompresses them all into the final result - media files, to be consumed by [Plex][plex], [Emby][emby], [Komga][komga], [Calibre-Web][calibre-web]), etc.

 * [NZBGet][nzbget] downloads data from usenet servers based on .nzb definitions. Like [SABnzbd][sabnzbd], but written in C++ and designed with performance in mind to achieve maximum download speed by using very little system resources (_this is a popular alternative to SABnzbd_)

@@ -30,7 +30,7 @@ Tools included in the AutoPirate stack are:

 * [Readarr][readarr] finds, downloads, and manages eBooks

-* [Lidarr][lidarr] is an automated music downloader for NZB and Torrent. It performs the same function as [Headphones][headphones], but is written using the same(ish) codebase as [Radarr][radarr] and [Sonarr](/recipes/autopirate/sonarr). It's blazingly fast, and includes beautiful album/artist art. Lidarr supports [SABnzbd](/recipes/autopirate/sabnzbd/), [NZBGet](/recipes/autopirate/nzbget/), Transmission, µTorrent, Deluge and Blackhole (_just like Sonarr / Radarr_)
+* [Lidarr][lidarr] is an automated music downloader for NZB and Torrent. It performs the same function as [Headphones][headphones], but is written using the same(ish) codebase as [Radarr][radarr] and [Sonarr][sonarr]. It's blazingly fast, and includes beautiful album/artist art. Lidarr supports [SABnzbd][sabnzbd], [NZBGet][nzbget], Transmission, µTorrent, Deluge and Blackhole (_just like Sonarr / Radarr_)

 * [Mylar][mylar] is a tool for downloading and managing digital comic books / "graphic novels"

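For context, the AutoPirate "uber-recipe" touched by the two hunks above runs each of these tools as a service in a single Docker Swarm stack behind Traefik. The fragment below is only an illustrative sketch of that pattern — the service names, image tags, hostnames, ports and storage paths are assumed typical defaults, not taken from the cookbook's actual compose file:

```yaml
version: "3.2"

services:
  sabnzbd:
    image: lscr.io/linuxserver/sabnzbd:latest     # assumed image tag, not from the recipe
    volumes:
      - /var/data/autopirate/sabnzbd:/config      # illustrative shared-storage paths
      - /var/data/media/downloads:/downloads
    networks:
      - internal
      - traefik_public
    deploy:
      labels:
        # Traefik v2-style labels; the hostname is a placeholder
        - traefik.enable=true
        - traefik.http.routers.sabnzbd.rule=Host(`sabnzbd.example.com`)
        - traefik.http.services.sabnzbd.loadbalancer.server.port=8080

  sonarr:
    image: lscr.io/linuxserver/sonarr:latest      # assumed image tag
    volumes:
      - /var/data/autopirate/sonarr:/config
      - /var/data/media:/media                    # sees SABnzbd's completed downloads
    networks:
      - internal

networks:
  internal:
    driver: overlay
  traefik_public:
    external: true                                # assumes the Traefik recipe created this network
```

Deployed with something like `docker stack deploy autopirate -c autopirate.yml`, all the tools share the same overlay network and storage paths, which is what lets Sonarr hand `.nzb` files to SABnzbd and then pick up the finished downloads.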
@@ -63,4 +63,4 @@ calibre-server:
 --8<-- "recipe-autopirate-toc.md"
 --8<-- "recipe-footer.md"

-[^2]: The calibre-server container co-exists within the Lazy Librarian (LL) containers so that LL can automatically add a book to Calibre using the calibre-server interface. The calibre library can then be properly viewed using the [calibre-web](/recipes/calibre-web) recipe.
+[^2]: The calibre-server container co-exists within the Lazy Librarian (LL) containers so that LL can automatically add a book to Calibre using the calibre-server interface. The calibre library can then be properly viewed using the [calibre-web][calibre-web] recipe.
@@ -7,7 +7,7 @@ description: Lidarr is an automated music downloader for NZB and Torrent
 !!! warning
     This is not a complete recipe - it's a component of the [autopirate](/recipes/autopirate/) "_uber-recipe_", but has been split into its own page to reduce complexity.

-[Lidarr](https://lidarr.audio/) is an automated music downloader for NZB and Torrent. It performs the same function as [Headphones](/recipes/autopirate/headphones), but is written using the same(ish) codebase as [Radarr][radarr] and [Sonarr][sonarr]. It's blazingly fast, and includes beautiful album/artist art. Lidarr supports [SABnzbd][sabnzbd], [NZBGet][nzbget], Transmission, µTorrent, Deluge and Blackhole (_just like Sonarr / Radarr_)
+[Lidarr](https://lidarr.audio/) is an automated music downloader for NZB and Torrent. It performs the same function as [Headphones][headphones], but is written using the same(ish) codebase as [Radarr][radarr] and [Sonarr][sonarr]. It's blazingly fast, and includes beautiful album/artist art. Lidarr supports [SABnzbd][sabnzbd], [NZBGet][nzbget], Transmission, µTorrent, Deluge and Blackhole (_just like Sonarr / Radarr_)

 { loading=lazy }

@@ -24,7 +24,7 @@ Similar to the other backup options in the Cookbook, we can use Duplicati to bac

 !!! summary "Ingredients"
     *[X] [Docker swarm cluster](/docker-swarm/design/) with [persistent shared storage](/docker-swarm/shared-storage-ceph/)
-    * [X] [Traefik](/docker-swarm/traefik) and [Traefik-Forward-Auth](/docker-swarm/traefik-forward-auth) configured per design
+    * [X] [Traefik](/docker-swarm/traefik/) and [Traefik-Forward-Auth](/docker-swarm/traefik-forward-auth/) configured per design
     * [X] Credentials for one of the Duplicati's supported upload destinations

 ## Preparation
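The ingredients in the hunk above assume Duplicati's web UI is published through Traefik with Traefik-Forward-Auth in front of it. A minimal sketch of that wiring, under stated assumptions (the hostname, image tag, mount paths and middleware name are illustrative, not the recipe's actual values):

```yaml
version: "3.2"

services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest   # assumed image tag
    volumes:
      - /var/data/duplicati:/config               # illustrative path on shared storage
      - /var/data:/source:ro                      # data to back up, mounted read-only
    networks:
      - traefik_public
    deploy:
      labels:
        - traefik.enable=true
        # placeholder hostname; Duplicati's web UI listens on 8200 by default
        - traefik.http.routers.duplicati.rule=Host(`duplicati.example.com`)
        - traefik.http.services.duplicati.loadbalancer.server.port=8200
        # assumes a forward-auth middleware was defined per the traefik-forward-auth recipe
        - traefik.http.routers.duplicati.middlewares=forward-auth

networks:
  traefik_public:
    external: true
```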
@@ -13,7 +13,7 @@ phpIPAM fulfils a non-sexy, but important role - It helps you manage your IP add

 ## Why should you care about this?

-You probably have a home network, with 20-30 IP addresses, for your family devices, your [IoT devices](/recipes/homeassistant), your smart TV, etc. If you want to (a) monitor them, and (b) audit who does what, you care about what IPs they're assigned by your DHCP server.
+You probably have a home network, with 20-30 IP addresses, for your family devices, your [IoT devices][homeassistant], your smart TV, etc. If you want to (a) monitor them, and (b) audit who does what, you care about what IPs they're assigned by your DHCP server.

 You could simple keep track of all devices with leases in your DHCP server, but what happens if your (_hypothetical?_) Ubiquity Edge Router X crashes and burns due to lack of disk space, and you loose track of all your leases? Well, you have to start from scratch, is what!

@@ -1,6 +1,6 @@
 # Data layout

-The applications deployed in the stack utilize a combination of data-at-rest (_static config, files, etc_) and runtime data (_live database files_). The realtime data can't be [backed up](/recipes/duplicity) with a simple copy-paste, so where we employ databases, we also include containers to perform a regular export of database data to a filesystem location.
+The applications deployed in the stack utilize a combination of data-at-rest (_static config, files, etc_) and runtime data (_live database files_). The realtime data can't be [backed up][duplicity] with a simple copy-paste, so where we employ databases, we also include containers to perform a regular export of database data to a filesystem location.

 So that we can confidently backup all our data, I've setup a data layout as per the following example:

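The "regular export of database data" mentioned in the hunk above is typically a small sidecar container that dumps the live database to a flat file under the app's data directory, so a file-level backup tool such as Duplicity can safely pick it up. A minimal sketch, assuming a MariaDB-backed app (the image, schedule, credentials and paths below are illustrative, not the cookbook's exact implementation):

```yaml
version: "3.2"

services:
  db:
    image: mariadb:10                              # assumed database image
    environment:
      MYSQL_ROOT_PASSWORD: changeme                # placeholder secret
    volumes:
      - /var/data/runtime/myapp/db:/var/lib/mysql  # live database files (not safe to copy directly)
    networks:
      - internal

  db-backup:
    image: mariadb:10
    environment:
      MYSQL_ROOT_PASSWORD: changeme
    volumes:
      - /var/data/myapp/database-dump:/dump        # flat dumps sit with the backup-able data
    entrypoint: |
      bash -c 'while true; do
        mysqldump -h db -u root -p"$$MYSQL_ROOT_PASSWORD" --all-databases > /dump/dump_$$(date +%F_%H%M).sql
        sleep 86400
      done'
    networks:
      - internal

networks:
  internal:
    driver: overlay
```

The dump loop keeps the runtime database under `/var/data/runtime/` (excluded from backup) while its daily exports land under `/var/data/`, which matches the data-at-rest vs runtime split the page describes.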