A faux thought stream
There isn't much right now but it's coming! Think of this as a Facebook timeline but with instant
Registry (BRCL pt 2)
So I published what I had so far and now I'm waiting for registration requests to pop in. It seems to have been well received so far. Here's the link to the registry.
BRCL (AKA 3-letter-codes must go)
For a while now I have been thinking of starting up a registry of constructed languages using a four-letter code that was co-developed/proposed by some interested parties and me a few years ago.
There is currently an initiative that assigns ISO 639-3 language codes to constructed languages out of the section of the code space reserved for private use, but that means conlangs will always play second fiddle in a system originally designed for a different purpose (cataloguing all natural languages).
Furthermore, while there are by some estimates 6 000–8 000 natural languages, if conlanging becomes a more popular pastime we must acknowledge that it's not rare to see a single person developing a multitude of conlangs. With just 4000 conlangers around the world making 8 conlangs each, we'd already have used up 32 000 codes, beyond the 17 576 combinations a three-letter alphabetic code can support.
With this in mind, the four-letter code system I sketched out specifically for conlangs accommodates 456 976 combinations using the 26 basic Latin characters. The code additionally permits the apostrophe ' and has special combinations for language names that are shorter, giving a maximum of 594 216 valid codes. If we take our example of 4000 active conlangers above, each of them could register 148 distinct language codes!
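The headline figures check out with a quick back-of-the-envelope calculation (the 594 216 total also counts the apostrophe and short-name combinations, which I won't re-derive here):

```python
# Quick sanity check of the capacity figures quoted above.
three_letter = 26 ** 3   # plain three-letter alphabetic codes
four_letter = 26 ** 4    # four basic Latin letters
eit = 27 ** 3            # three characters from 26 letters + apostrophe

print(three_letter)      # 17576 -- less than the 32 000 codes in the example
print(four_letter)       # 456976 combinations, as stated
print(eit)               # 19683 varieties per macrolanguage
print(594216 // 4000)    # 148 codes per conlanger with the full code space
```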
It doesn't end there. Since there are probably more than 4000 conlangers out there (a quick skim of the major Facebook groups and social fora rapidly surpasses 4000 individuals, even accounting for duplicate membership), the code system encourages the registration of so-called macrolanguages. These are useful designations for groups of languages that are tightly related genetically. As such, the many varieties of, say, Tolkien's elven language Sindarin could be encoded under one such code
sind and further specified by what is known as the Extended-Identity-Tag (EIT): an optionally specified, mandatorily supported three-letter code. In this extended distinction mark-up, the Falathrin variety of Sindarin would be written as
sind-fal. The EIT being three characters long permits distinguishing 19 683 different varieties of a macrolanguage, which I suspect should be quite enough for most use cases.
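As a sketch, the base-code-plus-EIT shape can be checked with a simple regular expression. This is only my illustration of the rules described above (lowercase Latin letters plus the apostrophe, with an optional three-character EIT); the registry's actual grammar, including the short-name combinations, is more involved:

```python
import re

# Sketch of a BRCL-style code validator. Assumptions (mine, not the
# registry's official grammar): codes are lowercase, the apostrophe may
# appear in any position, and the EIT draws on the same 27-symbol alphabet.
CODE_RE = re.compile(r"^[a-z']{4}(-[a-z']{3})?$")

def is_valid(code: str) -> bool:
    """Return True if `code` matches the sketched base-code[-EIT] shape."""
    return CODE_RE.fullmatch(code) is not None

print(is_valid("sind"))      # True  -- base code for Sindarin
print(is_valid("sind-fal"))  # True  -- Falathrin variety via the EIT
print(is_valid("si"))        # False -- shorter names use the special combinations
```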
I will detail more examples in the future as time materialises. Stay tuned!
Openshift hurts my soul (AKA pluralise with care)
Being new to the platform, we both made a typo that eluded us for quite a while. I followed the guide and ran
$ oc login blabla:8443
$ oc new-project tepro
$ oc project tepro
$ oc create serviceaccount robot
$ oc serviceaccounts get-token robot | xclip
$ oc policy add-role-to-user view system:serviceaccounts:robot
role "view" added: "system:serviceaccounts:robot"
THIS IS WRONG. Notice the final s in system:serviceaccounts. A user added this way will show up in the web console as a system user and not a service account. When you then try running an API request using the account token (say, listing all pods in the project), you'll get a 403 Forbidden. Siiigh...
The correct way of doing it requires you to scope it as system:serviceaccount:..., i.e. run
$ oc policy add-role-to-user view system:serviceaccount:robot
role "view" added: "system:serviceaccount:robot"
which in retrospect makes sense. OTOH, OpenShift lets you add the role without any warning that __maybe__, just __maybe__, you meant to add the role to a service account.
We figured it out with the help of one of our resident OpenShift experts, whose question “But why is it a system account?” put us on the right track.
On Fat Tuesday (“fettisdagen”), Swedes celebrate by eating a particularly calorie-rich pastry known as a semla. They, unlike monads but like Erlang, are great.
Mule, data encoding, headaches
It turns out that sending UTF-8-encoded string data across a transport boundary and then attempting to capture the byte array straight into an ISO 8859-1 string leads only to a headache (and funny symbols!).
So instead of using a byte-array-to-string transformer with encoding set to ISO 8859-1, use a chain of two transformers: one with encoding="UTF-8" followed by one with encoding="ISO 8859-1". That leads to happiness!
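Mule aside, the underlying problem is easy to reproduce in a few lines of Python (my illustration, not Mule code):

```python
# UTF-8 bytes arriving over a transport boundary.
data = "Héllo wörld".encode("utf-8")

# Wrong: interpret the UTF-8 bytes directly as ISO 8859-1 -> mojibake,
# because each multi-byte UTF-8 sequence becomes two Latin-1 characters.
wrong = data.decode("iso-8859-1")
print(wrong)   # HÃ©llo wÃ¶rld

# Right: decode as UTF-8 first, then (if a downstream system insists on
# Latin-1) re-encode in a second step -- the two-transformer chain.
text = data.decode("utf-8")
latin1_bytes = text.encode("iso-8859-1")
print(text)    # Héllo wörld
```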
Using rsync to update this website
So instead of manually synchronising my local out/ dir that Ivy generates (yish!), I just wrote the world's simplest shell script to gen and publish.
#!/bin/sh
ivy build -c
rsync -av --rsh=ssh ./out/ user@domain:/www-data-dir/
Save it as sync, run chmod +x sync, and run ./sync from the Ivy folder whenever you want to clean, build, and synchronise everything. :-)
Má là (麻辣) && hot pot
I went out with Johanna to a (to us) new restaurant, Restaurang Formosa, which has apparently been around in some form or other since the late 60s and had a lovely hot pot menu: several kinds of finely sliced meats, vegetables, noodles and seafood in a split hot pot, one side replete with “mild”, flavourful broth and the other HOT. Foolishly, I decided to consume only from the chili-red broth, as Johanna isn't the world's greatest fan of chili or spicy food. An hour and a half later, I realised my grave mistake.
Nevertheless, we both had a lovely time, with me giving up near the end after having eaten around 10 chili peppers fished out of my broth and more Sichuan peppercorns than I've ever had in a single sitting. A strong recommend!
Trying out Ivy...
So I figured I would try out Ivy as a static website generator, as I have a strong allergy to CSS, web standards (living or not), and DOM hackery. We'll see if it's any good for my porpoises.
So far, I'm liking the code highlighting (using Pygments):
p ("hello world\n")
and the output of the build command:
$ ivy build
Rendered: 8 · Written: 1 · Time: 0.27 sec · Avg: 0.0334 sec/page