I thought of adding another Proxmox node to the cluster. Instead of having PBS on a separate physical box I wanted to have it virtualized, same as in any other environment I set up. So I installed a fresh copy of the same Proxmox VE version and tried to join the cluster. And then this message came: And plenty of others regarding hostname, SSH keys, quorum etc. Finally the Proxmox UI broke and I was unable to sign in. Restarting the cluster services ended with some meaningless output. So I was a little bit worried about the situation. Of course I had a backup of all
If you have OpenHAB on Proxmox or any other virtualization and it sometimes fails to grab the RTSP stream and create snapshots, then there is a high chance that everything is fine with the camera and the network, and the problem lies in your server hardware. I investigated this matter a lot and came to this simple conclusion. If the camera does not expose a built-in snapshot URL via ONVIF (like the EasyCam WiFi with both ONVIF and Tuya), then OpenHAB will try to make one from the RTSP stream with ffmpeg. It starts an ffmpeg process which will periodically (in my case every
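As a rough illustration, here is a minimal Python sketch of what such a periodic grab boils down to: one short-lived ffmpeg process per snapshot. The RTSP URL, credentials and file name are hypothetical placeholders; reproducing the grab by hand like this is a handy way to tell a camera problem from a host problem.

```python
import subprocess

# Hypothetical RTSP URL -- replace with your camera's stream address.
RTSP_URL = "rtsp://admin:password@192.168.1.50:554/stream1"

def grab_snapshot(path: str = "snapshot.jpg", timeout: int = 10) -> bool:
    """Grab a single frame from the RTSP stream, roughly what OpenHAB
    does internally when no ONVIF snapshot URL is available."""
    cmd = [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # TCP tends to be more reliable than UDP over virtual bridges
        "-i", RTSP_URL,
        "-frames:v", "1",          # stop after the first decoded video frame
        "-q:v", "2",               # JPEG quality (2 = high)
        "-y", path,                # overwrite the previous snapshot
    ]
    try:
        subprocess.run(cmd, check=True, timeout=timeout,
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        return True
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired):
        # On flaky host hardware this is where the periodic grabs fail.
        return False

if __name__ == "__main__":
    print("snapshot ok" if grab_snapshot() else "snapshot failed")
```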
From 8k to 29k writes per second. We took the IP2Location DB11 database. It holds a few million IPv4 ranges which unwrap into over 2 billion addresses. Such a number of entries is actually not a big deal for the PostgreSQL RDBMS or the Apache Cassandra distributed database system. However, there is the issue of ingestion speed. The question is how quickly I can programmatically compute IP addresses from IP ranges and insert them into persistent storage. PostgreSQL can easily hold around 10TB of data on a single node. It can hold even more, especially if divided into separate partitions/tables or if you use multiple
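For reference, a minimal Python sketch of the unwrap-and-insert step, assuming a hypothetical ip_geo(ip, country) table and the psycopg2 driver; batching rows through COPY instead of single-row INSERTs is one common way to push PostgreSQL write throughput from single-digit to tens of thousands of writes per second.

```python
import io
import ipaddress
import psycopg2  # pip install psycopg2-binary

# Hypothetical connection string and table -- adjust to your schema.
conn = psycopg2.connect("dbname=geoip user=postgres")

def unwrap(ip_from: int, ip_to: int):
    """Yield every IPv4 address in a range; IP2Location DB11 stores
    the range bounds as 32-bit integers."""
    for n in range(ip_from, ip_to + 1):
        yield str(ipaddress.IPv4Address(n))

def ingest(ip_from: int, ip_to: int, country: str) -> None:
    """Stream the unwrapped range into PostgreSQL via COPY."""
    buf = io.StringIO()
    for addr in unwrap(ip_from, ip_to):
        buf.write(f"{addr}\t{country}\n")  # building these strings is the Python-side cost
    buf.seek(0)
    with conn.cursor() as cur:
        cur.copy_from(buf, "ip_geo", columns=("ip", "country"))
    conn.commit()

ingest(3232235776, 3232236031, "PL")  # 192.168.1.0 - 192.168.1.255
```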
Ever wondered what the highest load average on a Unix-like system can be? Do we even know what this parameter tells us? It shows the average number of processes that are either actively running or waiting. It should stay close to the number of logical processors present on the system; if it is greater than that, some tasks have to wait for their turn to be executed. So I was testing 1000 LXC containers on a 2 x 6 core Xeon system (24 logical processors with hyper-threading) and left it alone for a while. Once I got back I saw that
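You can watch this yourself with a few lines of Python; the threshold logic below is just the rule of thumb described above.

```python
import os

# 1, 5 and 15 minute load averages, the same numbers `uptime` shows.
load1, load5, load15 = os.getloadavg()
cpus = os.cpu_count()  # logical processors, e.g. 24 on a 2 x 6 core HT Xeon

print(f"load averages: {load1:.2f} {load5:.2f} {load15:.2f} on {cpus} CPUs")
if load1 > cpus:
    # More runnable (or, on Linux, uninterruptibly waiting) processes
    # than logical CPUs: some of them are queueing for execution.
    print(f"saturated: roughly {load1 / cpus:.1f}x over capacity")
```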
If you wonder how to automate Ubuntu virtual machine creation and then deploy it to Proxmox in multiple copies, then Packer and Terraform are what you are looking for. Side note: going for virtual machines in Proxmox is the proper way. I tried for several days to get LXC containers working, but in the end I would say it is not the best option, with a lot of things going wrong: cgroups, AppArmor, nesting, FUSE, ingress networking etc. There is simply too much to handle with LXC, and with a VM there is no such problem, so the discussion ends here in favour of Proxmox
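As a sketch of the pipeline shape only (the file names, directory layout and vm_count variable are hypothetical, and the real work lives in the Packer template and Terraform configuration), the automation reduces to two CLI steps:

```python
import subprocess

def run(cmd: list[str], cwd: str = ".") -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True, cwd=cwd)

# 1. Packer builds a reusable Ubuntu template on the Proxmox host.
run(["packer", "build", "ubuntu-template.pkr.hcl"], cwd="packer")

# 2. Terraform clones that template into N virtual machines.
run(["terraform", "init"], cwd="terraform")
run(["terraform", "apply", "-auto-approve", "-var", "vm_count=5"], cwd="terraform")
```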
If you run a digital services platform or critical infrastructure, then most probably you are covered by NIS 2 and its requirements, including those concerning information security. Even if you are not covered by NIS 2, you may still benefit from its regulations, which seem similar to those coming from ISO 27001. In this article I show how to automatically deploy anti-rootkit and anti-virus software on your Linux workstations and servers. TL;DR: By using the rkhunter anti-rootkit and the ClamAV anti-virus you are closer to NIS 2 and ISO 27001 and farther away from threats like cryptocurrency miners and ransomware.
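A minimal sketch of what "automatically deploy" can look like on a Debian/Ubuntu host, assuming plain apt packages; the package names and scan flags are standard, but a real rollout would likely use a configuration-management tool rather than this script.

```python
import subprocess

def run(cmd: list[str], check: bool = True) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=check)

def deploy() -> None:
    run(["apt-get", "update"])
    run(["apt-get", "install", "-y", "rkhunter", "clamav", "clamav-daemon"])
    # On Ubuntu the clamav-freshclam service fetches virus signatures
    # automatically; run `freshclam` manually only if that service is stopped.
    run(["rkhunter", "--propupd"])  # baseline file properties on a known-clean system
    # Scanners signal findings via non-zero exit codes, so don't treat those as crashes.
    run(["rkhunter", "--check", "--sk"], check=False)    # anti-rootkit scan, skip keypress prompts
    run(["clamscan", "-r", "-i", "/home"], check=False)  # recursive AV scan, report infected files only

if __name__ == "__main__":
    deploy()
```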
Create a secure, high-performance, affordable environment for your container applications using Hetzner dedicated servers, for around 50€ per month. This setup can also be done with different server providers, both dedicated and shared, even on public cloud. This tutorial has not been sponsored by Hetzner or any other vendor. If you are interested in a similar setup, please drop me a message via LinkedIn. Goal: the goal of this setup is to run Docker containers in Swarm mode in a secure and reliable environment. For the sake of security we enable the Proxmox firewall, the pfSense firewall and Suricata IDS/IPS. For the sake of reliability we configure
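For the Swarm-mode part of the goal, here is a minimal sketch using the Python Docker SDK; the address, image and replica count are hypothetical placeholders, not the tutorial's actual values.

```python
import docker  # pip install docker

client = docker.from_env()

# Initialize Swarm mode on the first node (the manager).
client.swarm.init(advertise_addr="10.0.0.10")

# Run a replicated service; worker nodes join later with `docker swarm join`.
client.services.create(
    image="nginx:stable",
    name="web",
    mode=docker.types.ServiceMode("replicated", replicas=3),
)
```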
Keep your data safe. The impact of device loss or unauthorized access can be mitigated by encrypting the drives in your servers and workstations. So you may have an Ubuntu Linux installation on bare metal or in a virtual machine. Does it have an encrypted drive? If the answer is no, then you could be in trouble when the device is stolen or lost, or someone simply gains unauthorized access to your hardware. In this short step-by-step article you can see what steps you should take to encrypt your unencrypted drives without needing to reinstall the system. When speaking of workstations, there is much less concern about
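As a sketch of the in-place approach on a non-root data partition, assuming cryptsetup 2.2+ with LUKS2 reencryption, an unmounted ext4 filesystem and a current backup (the device name is hypothetical):

```python
import subprocess

# Hypothetical device -- an unmounted, backed-up ext4 data partition.
DEVICE = "/dev/sdb1"

def run(cmd: list[str]) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Shrink the filesystem so the LUKS2 header fits at the start of the partition.
run(["e2fsck", "-f", DEVICE])
run(["resize2fs", "-M", DEVICE])  # shrink ext4 to its minimum size

# 2. Encrypt in place (cryptsetup >= 2.2 supports LUKS2 reencryption).
run(["cryptsetup", "reencrypt", "--encrypt", "--reduce-device-size", "32M", DEVICE])

# 3. Open the now-encrypted device and grow the filesystem back to full size.
run(["cryptsetup", "open", DEVICE, "cryptdata"])
run(["resize2fs", "/dev/mapper/cryptdata"])
```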
Backup is a must. Primary hardware fails. Local backups can also fail or become inaccessible. Remote backups can fail too, but if you have 2, 3 or even more backup copies in different places and on various media, the chances are high that you will survive major incidents without data loss or too much downtime. Here we are talking about the Microsoft Azure public cloud platform, but in any infrastructure environment you should have working and verified backup tools. Azure has its own. To keep those backups in a secure remote place (in the context of a Storage Account) you can