Albrecht von Lucke on Jung und Naiv – listen in for about 10 minutes starting at the 64-minute mark, it's worth it:
Software Development in the Year 2018
Interested in the current state of the art (in the real world)? Read this:
https://news.ycombinator.com/item?id=18442637
Favorite comment so far:
“We have absolutely no idea how to write code. I always wonder if it’s like this for other branches of engineering too? I wonder if engineers who designed my elevator or airplane had “ok it’s very surprising that it’s working, let’s not touch this” moments. Or chemical engineers synthesize medicines in way nobody but a rockstar guru understands but everyone changes all the time. I wonder if my cellphone is made by machines designed in early 1990s because nobody was able to figure out what that one cog is doing.
Software is a mess. I’ve seen some freakishly smart people capable of solving very hard problems writing code that literally changes the world at this very moment. But the code itself is, well, a castle of shit. Why? Is it because our tools (programming languages, compilers etc) are still stone age technology? Is it because software is inherently a harder problem than say machines or chemical processes for the human brain? Is it because software engineers are less educated than other engineers? ”
Measure your web page with web.dev
A new Google service that gives hints on how to improve your website.
It also offers learning resources.
Looks useful, though it is still in beta …
Posthum – Cologne Fake Art
What to do with Apache logs?
Two simple things are easily achievable:
- Load the logs into a log file analyzer, e.g. https://matomo.org/
- Depersonalize the logs
An example setup on Ubuntu is shown below.
Configure two daily cron jobs:
0 1 * * * /scripts/import-logfiles.sh
0 2 * * * /scripts/depersonalize-apache-logs.sh
Use import-logfiles.sh to load all server requests into the Matomo database. Use depersonalize-apache-logs.sh to anonymize all logs older than seven days. Depersonalization is achieved by replacing the last two bytes of each IP address with 0.
Both scripts work on a default Ubuntu setup of Apache 2. Apache log files are compressed and end with ‘gz’. They are placed in ‘/var/log/apache2’ and start with the prefix ‘localhost-access.’
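To make the depersonalization step more concrete, here is a minimal sketch of what depersonalize-apache-logs.sh could look like. It is an assumption based on the description above, not the actual script; only the directory and file name prefix are the ones mentioned here.

#!/bin/bash
# Hypothetical sketch of depersonalize-apache-logs.sh: set the last two
# bytes of every IPv4 address to 0 in all rotated logs older than 7 days.
LOG_DIR="/var/log/apache2"
PREFIX="localhost-access."
find "$LOG_DIR" -name "${PREFIX}*.gz" -mtime +7 | while read -r f; do
  zcat "$f" \
    | sed -E 's/([0-9]{1,3}\.[0-9]{1,3})\.[0-9]{1,3}\.[0-9]{1,3}/\1.0.0/g' \
    | gzip > "$f.tmp" && mv "$f.tmp" "$f"
done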
Kölner Grünsystem – Public Reception at the City Hall
Next Tuesday, November 13, a reception on the topic of “Kölner Grünsystem” will take place at the City Hall from 16:30, on the occasion of the European Cultural Heritage Year 2018. To draw attention to the current threat to the Gleueler Wiese, the citizens’ initiative “Grüngürtel für alle” will hold a silent vigil on site from 15:30.
Joint statement by NABU / BUND on participation in “StadtGrün naturnah”
Stop listening
Google Home devices and Apple HomePod both have voice commands to mute the microphone from across the room — “OK Google, mute the microphone” and “Hey Siri, stop listening” — but not Amazon Echo devices.
On Amazon Echo devices you have to press the mute button. But this does not help against Amazon Echo Remote devices: the remote’s microphone (which is push-to-talk, not always on) remains available even if the main unit’s microphone array is disabled.
https://www.androidcentral.com/how-disable-microphone-amazon-echo
https://www.howtogeek.com/237397/how-to-stop-your-amazon-echo-from-listening-in/
Farming robots
HTTP/3
“The protocol that’s been called HTTP-over-QUIC for quite some time has now changed name and will officially become HTTP/3. This was triggered by this original suggestion by Mark Nottingham.”
On GitHub – A web-based mission control framework
What data retention (Vorratsdatenspeicherung) means
…can be gleaned from this talk from 2016, in which 2.5 years of Spiegel Online are analyzed.
When copiers don’t copy…
…software is probably involved.
A very watchable and entertaining talk from 2014, in case you don’t know it yet…
“Copiers that spontaneously change numbers in a document: in August 2013 it came out that practically all Xerox scan-copiers simply replace numbers and letters with other ones while scanning. Since users can hardly spot such errors, the bug is extremely dangerous and remained undetected for a long time: it has existed in the wild for more than eight years.”
Google Dataset Search
With Google Dataset Search, Google provides a search engine for research data. Indexing relies primarily on schema.org markup. Google also offers a tool that lets research data providers test their own metadata. A detailed article on the new search engine can be found on the Google Blog: http://ai.googleblog.com/2018/09/building-google-dataset-search-and.html
Playing Doom
A nice browser version of the all-time classic.
Karies und Euternase im G9
URL encoding in Java
Here is how I encode URLs in Java:
- Split the URL into its structural parts. Use java.net.URL for this.
- Encode each part properly
- Use IDN.toASCII(putDomainNameHere) to Punycode-encode the host name!
- Use java.net.URI.toASCIIString() to percent-encode NFC-encoded Unicode (better would be NFKC!). For more info see: How to encode properly this URL

URL url = new URL("http://search.barnesandnoble.com/booksearch/first book.pdf");
URI uri = new URI(url.getProtocol(), url.getUserInfo(), IDN.toASCII(url.getHost()), url.getPort(), url.getPath(), url.getQuery(), url.getRef());
String correctEncodedURL = uri.toASCIIString();
System.out.println(correctEncodedURL);
Prints
http://search.barnesandnoble.com/booksearch/first%20book.pdf
What to do with Apache log files?
Two things that are easy to implement:
- Load the logs into an analytics tool, e.g. https://matomo.org/
- Anonymize them
Here is an example setup on an Ubuntu machine.
Two daily cron jobs are set up:
0 1 * * * /scripts/import-logfiles.sh
0 2 * * * /scripts/depersonalize-apache-logs.sh
The `import-logfiles.sh` script loads the previous day's server requests into the Matomo database once a day. The `depersonalize-apache-logs.sh` script processes all log files older than seven days; for every IP address, the last two bytes are set to 0.
Both example scripts can be found on GitHub. They assume that Apache rotates its log files and compresses them to `gz` files that are located under `/var/log/apache2` and always start with the string “other_vhosts_access.”.
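For illustration, here is a minimal sketch of what import-logfiles.sh could look like, assuming Matomo's bundled log importer (misc/log-analytics/import_logs.py) is used. The Matomo install path, URL, and site ID below are placeholders, not the actual setup.

#!/bin/bash
# Hypothetical sketch of import-logfiles.sh: feed yesterday's rotated
# Apache log into Matomo via its bundled log importer.
MATOMO_DIR="/var/www/matomo"                        # assumed install path
LOG="/var/log/apache2/other_vhosts_access.log.1"    # yesterday's rotated log
python "$MATOMO_DIR/misc/log-analytics/import_logs.py" \
  --url="https://matomo.example.org" \
  --idsite=1 \
  "$LOG"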
Backup MySQL
This is how I back up MySQL databases.
The solution consists of three files.
The backup-db.sh is called every day, with parameter -b for backup and parameter -c for cleanup. The backup routine dumps all MySQL databases into one file. The cleanup routine removes old backups (older than 30 days, see the LIMIT variable). The restore routine requires hardcoding a specific snapshot in the script. The variables.conf must be placed in the same folder as backup-db.sh.
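As an illustration of the routines described above, here is a minimal sketch of what backup-db.sh could look like. The variable names read from variables.conf (DB_USER, DB_PASS, BACKUP_DIR, LIMIT) are assumptions, and the restore routine is omitted.

#!/bin/bash
# Hypothetical sketch of backup-db.sh (restore routine omitted).
# Assumes variables.conf defines DB_USER, DB_PASS, BACKUP_DIR and LIMIT.
set -e
source "$(dirname "$0")/variables.conf"
case "$1" in
  -b) # backup: dump all MySQL databases into one timestamped file
      mysqldump --user="$DB_USER" --password="$DB_PASS" --all-databases \
        > "$BACKUP_DIR/$(date +%Y%m%d-%H%M%S).sql" ;;
  -c) # cleanup: remove dumps older than LIMIT days
      find "$BACKUP_DIR" -name '*.sql' -mtime +"$LIMIT" -delete ;;
  *)  echo "usage: $0 -b|-c" >&2; exit 1 ;;
esac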
This is how the backup folder looks after the script has been running daily for 32 days.
20181009-023001.sql 20181020-023001.sql 20181031-023001.sql
20181010-023001.sql 20181021-023001.sql 20181101-023001.sql
20181011-023001.sql 20181022-023001.sql 20181102-023001.sql
20181012-023001.sql 20181023-023001.sql 20181103-023001.sql
20181013-023001.sql 20181024-023001.sql 20181104-023001.sql
20181014-023001.sql 20181025-023001.sql 20181105-023001.sql
20181015-023001.sql 20181026-023002.sql 20181106-023001.sql
20181016-023001.sql 20181027-023001.sql 20181107-023001.sql
20181017-023001.sql 20181028-023001.sql 20181108-023001.sql
20181018-023001.sql 20181029-023002.sql
20181019-023001.sql 20181030-023001.sql
The folder will contain backups for the last 31 days.
Finally – GitHub allows you to delete issues
This is good news:
https://github.com/isaacs/github/issues/253#issuecomment-436721235
Find more info in the GitHub Changelog.
My own participation in the discussion is documented here:
https://github.com/jschnasse/issues-and-force-pushes/issues/1
The repo demonstrates how issues can establish permanent links to overwritten material.
Track user movement by WiFi transmissions
As the authors of “Adversarial WiFi Sensing” state in their conclusion:
“Our work brings up an inconvenient truth about wireless transmissions. While greatly improving our everyday life, they also unknowingly reveal information about ourselves and our actions. By designing a simple and powerful attack, we show that bad actors outside of a building can secretly track user presence and movement inside the building by just passively listening to ambient WiFi transmissions (even if they are encrypted). To defend against these attacks, we must control the volume and coverage of WiFi signals, or ask APs to obfuscate signals using cover traffic.
While our attack targets WiFi localization and tracking, our methodology can be generalized to sensing mechanisms at different RF frequencies (e.g. UHF, cellular, millimeter wave [63]) and other mediums (acoustic [32], ultrasound [47, 61], visible light, magnetics). Beyond this single attack, we hope to highlight largely overlooked privacy risks from ambient RF (and other) signals around us.”