Joey short
untitled

I encourage making websites return jwz balls when HN retitles your stuff. I assume JWZ is ok with being a CDN for this purpose.

Posted
Joey short
untitled

Hacker news adds a question mark to "The man who killed google search" to make it more "accurate" despite not having read it (or even AI summarized it I guess?)

Meanwhile, Metafilter shows me again why I love it despite itself.

Posted
Joey short
untitled

The reverse engineering of the JS they're doing is particularly amusing. Like, it contains 679 embedded javascript libraries and all of their licenses, and 1 MB of every load is used to send those licenses over the wire.

Posted
Joey short
untitled

It's been illuminating to watch Starlink's web interface bit rot over the past couple of years. Last month it rotted away entirely, with the device serving up only a logo.

Amusingly some users were able to restore the old web interface, which still works because the underlying data is still being provided (and probably will be, since the phone app uses the same data source).
github.com/iam-TJ/open-dishy/

Now when I go to dishy.starlink.com, it's running on my raspberry pi.

Posted
Joey short
untitled

Voyager is a bit more V'ger from today

Posted
Joey short
untitled

Literally had to go find a blurb that was like "Die Hard meets The Martian--with a dash of Knives Out" to purge that previous blurb from my mind.

Posted
Joey short
untitled

"highly commericial" wtf?

Posted
Joey short
untitled

editing a pdf form in firefox, what is this dark magic?

Posted
Anna (Anna and Mark: Waldeneffect)
Incubation handbook now in print!

I’m thrilled to announce that our incubation handbook is now in print! A very skilled intern helped buff up the text over the winter, and the result is ready to make your next hatch a major success.

(The ebook is a bit spiffier also, with the same information but more polish and a fancier cover and title.)

Here are some of the reviews of the first edition if you need more incentive to check it out:

“I have had problems with incubating chicks, getting low to no hatch, and high hatch mortality. All of the info in this book makes great sense! This helped me a lot to fix ALL of my hatch problems.” — sunnyweller

“I especially found the “helping chicks hatch” section very helpful. Followed the instructions and saved two chicks!” — Keaokun

“My first attempt at incubating was a dismal failure. I only hatched 6 of 19 eggs. Two of those had facial and beak deformities. This little ebook was so helpful and I was able to pinpoint – many – things I had done wrong.” — V. Schafer

“Awesome book, well written. Not too basic nor too much extraneous detail.” — chem girl

I’m hoping to enjoy another round of intern magic this summer, so I’d love to hear which ebook-only title you’d most like to have available in print. Or perhaps you’d prefer us to turn our newest video course into an ebook and paperback? Please comment and let me know what you want!

The post Incubation handbook now in print! first appeared on WetKnee Books.

Posted
Joey short
untitled

copyright question it seems worth pondering:

If I use a false persona to get malicious code into an open source project, and along the way include some good code to cover my tracks, and I mendaciously comply with all the standard stuff needed to get my code into the project (copyright statements etc), then is that good code actually freely licensed?

Posted
Joey short
untitled

mostly finished rebootstrapping from source after post-con crud

Posted
Anna (Anna and Mark: Wetknee)
Inoculating mushroom logs with sawdust spawn

We’ve written in the past about our mushroom experiments, which mostly centered around using plug spawn in logs. So I was thrilled when our local library offered an opportunity to try something a little different — sawdust spawn.

(Yes, we do have the best library around. Yes, they did let us take home an inoculated shiitake log of our very own.)

 

Pros and cons of sawdust spawn

Newly inoculated mushroom log

Sawdust inoculation tool

As best I can tell, the only real downside of using sawdust spawn is that you need to buy an inoculation tool. At $45 per tool, that means sawdust spawn makes the most sense for folks who intend to inoculate at least 36 logs (although you don’t have to do them all at once, of course). My math in today’s dollars:

  • Sawdust spawn: about $1 per log in spawn cost
  • Plug spawn: about $2.25 per log in spawn cost
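The 36-log figure above is just break-even arithmetic on the tool price; a quick sanity check (numbers from the post, which are approximate):

```python
# Approximate figures quoted in the post.
tool_cost = 45.00        # one-time inoculation tool
sawdust_per_log = 1.00   # spawn cost per log, sawdust
plug_per_log = 2.25      # spawn cost per log, plugs

# The tool pays for itself once the per-log savings cover its price.
break_even_logs = tool_cost / (plug_per_log - sawdust_per_log)
print(break_even_logs)  # 36.0
```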

In addition to long-term price savings, other benefits of using sawdust spawn include:

  • Your logs will produce mushrooms faster (in 5 to 12 months instead of 9 to 18 months).
  • I actually found inoculation with the sawdust tool gentler on my wrists (no hammering!).

 

Other inoculation innovations

Measuring mushroom log hole locations

Other than the inoculation tool, using sawdust spawn is pretty much the same as using plug spawn. But I thought you might enjoy seeing our teachers’ entire process since it is definitely better than ours!

First the infrastructure: They built tables with little wooden cradles at intervals to hold the logs in place. That means the only time you really need a second set of hands is when drilling the holes.

Also note the measuring stick with the spacing information on it. No laborious hand-measuring each log!

Drilling holes in a mushroom log with an angle grinder

Another innovation is the use of an angle grinder rather than a drill gun. Mark shared a video in which you can see how much faster this is than what we’d done in the past.

(Do be careful though. I could see someone drilling through their hand with this setup.)

Shiitake sawdust spawn

After the holes are drilled, it’s time to insert the spawn. Sawdust spawn comes in a block like the one shown above. You break it up with your hands then scoop some of the loose sawdust out into an empty yogurt container (or something similar).

Inoculating a mushroom log

Waxing a mushroom log

Next, bang the inoculation tool into the container a few times to fill it with spawn. Place the tool over the hole and depress the button at the top to insert spawn. The goal is for the spawn to fill the hole up to about the bark level.

After that, all you need to do is wax over each spawn-filled hole. In the past, we’ve used beeswax from local hives, but apparently any food-safe wax works. Our teachers were using paraffin, melted then daubed on with cute little brushes. But they mentioned that there’s a new kind of wax, primarily used with plug spawn, that you can wipe on cold with your finger.

After that, it’s the usual waiting game (with the side note that, since we now live in an area with less extreme precipitation than where we used to, we need to remember to water our log if we don’t get at least an inch of rain per week).

We haven’t had productive mushroom logs since moving to Ohio, but remembering how fun and easy inoculation was put the process back on my radar. Maybe next year we’ll push wildcrafting mushrooms onto the back burner and inoculate more logs.

 

About our teachers

Soulshine Acres mushrooms

I want to end with a huge thank you to Soulshine Acres for sharing their expertise with us. They’re a frequent vendor at the Athens, Ohio, farmers market if you want to check some of their mushrooms out. Or just follow them on Instagram using the link above to learn about their forest farm, full of over 400 mushroom logs.

The post Inoculating mushroom logs with sawdust spawn first appeared on WetKnee Books.

Posted
Joey short
untitled

it was also serving the front page as a 404 for the javascript linked from the front page yesterday, which is a very nice level of breakage indeed

Posted
Joey short
untitled

gotta give praise where due: by removing the proprietary web frontend from their starlink terminal, they drive free software development in the space of seeing basic obstruction maps, knowing when your starlink is obstructed or the network is otherwise down, etc

Making even 404 pages the same useless logo as the front page is also a strong choice.

github.com/sparky8512/starlink

Posted
Joey short
untitled

appimage mounts a clipboard, wtf?

Posted
Joey short
untitled

generation of the video archive has started, and since we're using a repository it's a collaborative public process which will culminate in a redundantly mirrored archive with rich metadata.

Here the day long youtube videos are being cut into clips github.com/distribits/distribi

I woke up refreshed home at last, ran a git-annex get, checked out the clips branch, ran the cut command, and have every talk available to review.

Posted
Joey short
untitled

Last sight of Dusseldorf. Great town!

Posted
Joey short
untitled

Performed a ceremonial tagging of Datalad 1.0 at the conclusion of

Posted
Joey short
untitled

"an octopus merge of 40 thousand branches" -- people are wild

Posted
Joey short
untitled

Streetcar I caught to the conference this morning.

Posted
Joey short
untitled

The NYT today demonstrates they can't comprehend an xkcd cartoon.

Not that I didn't already understand that about their tech reporting.

Posted
Joey short
untitled

Slides depicting a massive ecosystem with somehow central to it are a new thing I'm collecting. Scientists produce great slides like this. (And other great things.)

Posted
Joey short
untitled

When you write software to manage your cat photos and it gets used for brain-slicing scans to the tune of 2 petabytes of brain per year.

Posted
Joey short
untitled

looking forward to some strolls along the Rhine now that it's finally stopped torrentially raining

Posted
Joey short
untitled

new "Plans" section on tukaani.org/xz-backdoor/

"I plan to write an article how the backdoor got into the releases and what can be learned from this. I’m still studying the details.

xz.git needs to be gotten to a state where I’m happy to say I fully approve its contents. It’s possible that the recent commits in master will be rebased to purge the malicious files from the Git history so that people don’t download them in any form when they clone the repo. [...]"

Posted
Joey
reflections on distrusting xz

Was the ssh backdoor the only goal that "Jia Tan" was pursuing with their multi-year operation against xz?

I doubt it, and if not, then every fix so far has been incomplete, because everything is still running code written by that entity.

If we assume that they had a multilayered plan, that their every action was calculated and malicious, then we have to think about the full threat surface of using xz. This quickly gets into nightmare scenarios of the "trusting trust" variety.

What if xz contains a hidden buffer overflow or other vulnerability, that can be exploited by the xz file it's decompressing? This would let the attacker target other packages, as needed.

Let's say they want to target gcc. Well, gcc contains a lot of documentation, which includes png images. So they spend a while getting accepted as a documentation contributor on that project, and get a specially constructed png file added to it: one with additional binary data appended that exploits the buffer overflow and instructs xz to modify the source code that comes later when decompressing gcc.tar.xz.

More likely, they wouldn't bother with an actual trusting trust attack on gcc, which would be a lot of work to get right. One problem with the ssh backdoor is that, well, not all servers on the internet run ssh. (Or systemd.) So webservers seem a likely target for this kind of second-stage attack. Apache's docs include png files, nginx's do not, but there's always scope to add improved documentation to a project.

When would such a vulnerability have been introduced? In February, "Jia Tan" wrote a new decoder for xz. This added 1000+ lines of new C code across several commits. So much code, and in just the right place to insert something like this. And why take on such a significant project just two months before inserting the ssh backdoor? "Jia Tan" was already fully accepted as maintainer and doing lots of other work; it doesn't seem to me that they needed to start this rewrite as part of their cover.

They were working closely with xz's author Lasse Collin on this, by all indications exchanging patches off-list as they developed it. So Lasse Collin's commits in this time period are also worth scrutiny, because they could have been influenced by "Jia Tan". One that caught my eye comes immediately afterwards: "prepares the code for alternative C versions and inline assembly". Multiple versions and assembly mean even more places to hide such a security hole.

I stress that I have not found such a security hole, I'm only considering what the worst case possibilities are. I think we need to fully consider them in order to decide how to fully wrap up this mess.

Whether such stealthy security holes have been introduced into xz by "Jia Tan" or not, there are definitely indications that the ssh backdoor was not the end of what they had planned.

For one thing, the "test file" based system they introduced was extensible. They could have been planning to add more test files later, that backdoored xz in further ways.

And then there's the matter of the disabling of the Landlock sandbox. This was not necessary for the ssh backdoor, because the sandbox is only used by the xz command, not by liblzma. So why did they potentially tip their hand by adding that rogue "." that disables the sandbox?

A sandbox would not prevent the kind of attack I discuss above, where xz is just modifying code that it decompresses. Disabling the sandbox suggests that they were going to make xz run arbitrary code, that perhaps wrote to files it shouldn't be touching, to install a backdoor in the system.

Both deb and rpm use xz compression, and with the sandbox disabled, whether they link with liblzma or run the xz command, a backdoored xz can write to any file on the system while dpkg or rpm is running and no one is likely to notice, because that's the kind of thing a package manager does.

My impression is that all of this was well planned and they were in it for the long haul. They had no reason to stop with backdooring ssh, except for the risk of additional exposure. But they decided to take that risk, with the sandbox disabling. So they planned to do more, and every commit by "Jia Tan", and really every commit that they could have influenced needs to be distrusted.

This is why I've suggested to Debian that they revert to an earlier version of xz. That would be my advice to anyone distributing xz.

I do have an xz-unscathed fork, which I've carefully constructed to avoid all "Jia Tan"-involved commits. It feels good to not need to worry about dpkg and tar. I only plan to maintain this fork minimally, e.g. security fixes. Hopefully Lasse Collin will consider these possibilities and address them in his response to the attack.

Posted
Joey short
untitled

in a cafe in germany, wide awake, 16 hours of sleep seems to have beaten jetlag and accumulated xz sleep debt

I can't wait to learn about how a lot of people are using tomorrow at the Distribits conference!

Posted
Joey short
untitled

arrived in Dusseldorf for

ah europe, been too long.. also this is very very europe

Posted
Joey short
untitled

"Selfies please" - gate agent re facial recognition. 2024

Posted
Joey short
untitled

special shout out to whoever in the reversing channel is using alias "Jia Tan" ;-)

Posted
Joey short
untitled

closing all my social media before I go thru TSA security because it looks like Mr Robot was here

Posted
Joey short
untitled

my fun little surprise today was noticing liblzma in `ldd git-annex`

Pulled in via libmagic, which on Debian is patched to link to liblzma.

git-annex can be built without that (-f-MagicMime) but it does add a nice feature.

Anyway, interesting to know that Jia Tan's code is running in my processes forever unless xz gets reverted to the 2021 version.

Posted
Joey short
untitled

To find these, used:

git log --pretty=raw | perl -e 'while (<>) { if (/^commit /) { $ps=$s;$s=$_ }; if (/^author .* (\d+) [-+]\d+$/) { $pa=$a; $a=$_; $pad=$ad; $ad=$1; } if (/^committer .* (\d+) [-+]\d+$/) { $pc=$c; $c=$_; $pcd=$cd; $cd=$1; if (defined $pcd && defined $pad && $pcd==$cd && $pad==$ad) { if ($la ne $a && $lc ne $c) { print "\n" } ; $la = $a; $lc = $c; if (! defined $ls || $ls ne $ps) { print "$ps$pa$pc"; $ls=$ps}; print "$s$a$c"; } } }'

urk old habits die hard
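For the curious, here's roughly what that one-liner is doing, restated as a commented Python sketch (a loose reimplementation, not a byte-for-byte translation): it parses `git log --pretty=raw` output and reports runs of adjacent commits whose author timestamps and committer timestamps both match pairwise.

```python
import re

def find_shared_timestamp_runs(raw_log: str):
    """Scan `git log --pretty=raw` output for runs of adjacent commits
    whose author timestamps AND committer timestamps match pairwise --
    the pattern the perl one-liner above is hunting for."""
    commits = []
    cur = None
    for line in raw_log.splitlines():
        if line.startswith("commit "):
            cur = {"commit": line.split()[1]}
            commits.append(cur)
        elif cur is not None:
            # author/committer lines end with "<epoch> <tz offset>"
            m = re.match(r"^(author|committer) .* (\d+) [-+]\d+$", line)
            if m:
                cur[m.group(1)] = int(m.group(2))

    runs, run = [], []
    for prev, this in zip(commits, commits[1:]):
        same = (prev.get("author") is not None
                and prev.get("committer") is not None
                and prev.get("author") == this.get("author")
                and prev.get("committer") == this.get("committer"))
        if same:
            if not run:
                run = [prev]
            run.append(this)
        else:
            if len(run) > 1:
                runs.append(run)
            run = []
    if len(run) > 1:
        runs.append(run)
    return runs
```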

Posted
Joey short
untitled

Checked all xz commit timestamps for similar patterns. The first is a series of commits by Jia Tan on Jan 19, then another on Jan 22, then Lasse has a series on Feb 9, then a long series that includes the commits mentioned above, then 3 more series by Lasse on Feb 17 and Feb 29. This certainly seems unusual.

but I do find similar things in git.git history; Junio has a workflow that results in that legitimately

This suggests to me that xz's git workflow changed in January.

Posted
Joey short
untitled

the code changes in these commits are extensive and frightening given Jia Tan's involvement imho. Full new decoder being added with plans for assembly optimisations.

Posted
Joey short
untitled

a rebase would explain the common commit timestamps, but it preserves author timestamps

this seems a little suspicious, but maybe there is some other workflow that explains it

Posted
Joey short
untitled

anyone know of a common workflow that would result in 4 commits with 2 separate authors all having one timestamp as a common commit timestamp and a second timestamp as a common author timestamp?

Posted
Joey short
untitled

and apparently modifying its breakpoint detector alerts some other part of it and it changes behavior (according to discussion in a matrix channel)

Posted
Joey short
untitled

Or the easy way: Just push some plausible looking files to an extra branch that nobody looks at.

Posted
Joey short
untitled

There's been some exploration and possibly locking down of such binary data as a way to guard against some SHA1 collision attacks; it's been a while since I dug into it.

Posted
Joey short
untitled

While has people talking about issues with binary test files etc in source repos, and issues with using tarballs that can vary from git, doing a `git clone` and building in there is *also* exposed to a huge amount of binary data.

Including binary data hidden inside commit objects, for example. Also, git blobs are zlib compressed, so it might be possible to smuggle in extra binary data at the end. Possibly also at the end of tree objects; I don't remember if git checks for that.
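To illustrate the zlib point: a zlib stream has a well-defined end, and a decoder that stops there never sees trailing bytes unless it goes looking. A minimal sketch with Python's zlib module (plain zlib here, only mimicking git's blob header; whether git's actual object reader checks for leftovers is exactly the open question above):

```python
import zlib

# A complete zlib stream, followed by smuggled trailing bytes.
payload = zlib.compress(b"blob 5\x00hello")
smuggled = payload + b"EXTRA-HIDDEN-BYTES"

# A decoder that stops at end-of-stream never sees the trailing bytes...
d = zlib.decompressobj()
content = d.decompress(smuggled)
assert content == b"blob 5\x00hello"

# ...unless it explicitly checks what was left over after the stream ended.
assert d.unused_data == b"EXTRA-HIDDEN-BYTES"
```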

Posted
Joey short
untitled

Worth noting that some Jia Tan commits to xz were made with the github web interface. You can tell because they are signed by a gpg key github uses for web edits (4AEE18F83AFDEB23).

The most recent one is 62efd48a825e8f439e84c85e165d8774ddc68fd2.

So if GitHub keeps logs since January, they might have IP address information or other info.

Posted
Joey short
untitled

Well, dpkg built, not linked to an lzma library at all; instead it's using the xz command. Which is good enough for now; it seems to work, and I also downgraded xz to pre-Jia Tan.

Posted
Joey short
untitled

Of course I installed my hacked up dpkg. Seems to work anyway.

Posted
Joey short
untitled

install dpkg that I just hacked up to use a different library on my running system while running on 5 hours of sleep YN?

Posted
Joey short
untitled

this is all pre Jia Tan code

Posted
Joey short
untitled

Have this in a debian package now. Next I'll build dpkg against it. Whee.

joey@darkstar:~/tmp>ldd /usr/bin/xz
linux-vdso.so.1 (0x00007ffc88b86000)
liblzmaunscathed.so.5 => /lib/x86_64-linux-gnu/liblzmaunscathed.so.5 (0x00007f8424805000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f8424623000)
/lib64/ld-linux-x86-64.so.2 (0x00007f842486d000)
joey@darkstar:~/tmp>xz --version
xz (XZ Utils) 5.3.2alpha
liblzma 5.3.2alpha

Posted
Joey short
untitled

what a day to get up at 5 am for the third day in a row

Now I have to do it tomorrow and the next day, or jet lag will murder me on Tuesday.

Posted
Joey short
untitled

finding myself hacking on a fork of

Posted
Joey short
untitled

According to this, the backdoor had additional build code that was not gated behind checks for a debian or rpm package build. Although no one has gotten it to do anything yet, the presence of that code suggests other distributions may have also been targeted.

openwall.com/lists/oss-securit

Posted
Joey short
untitled

Today is a really good time to start gpg signing every git commit you make.

Especially if you're using infrastructure with on it that could still contain unknown backdoors.

I have signed all my commits since 2016.

git config commit.gpgSign 1

Posted
Joey short
untitled

doing some debian development this morning, of all things

(backporting dpkg to work with a sufficiently old xz that there is no possibility of other backdoors in it)

Posted
Joey short
untitled

one thing I'm sure about "Jia Tan" is that they had extensive prior experience with open source development. Everything they write in commits is pitch-perfect. This is not their first rodeo.

Kind of makes you wonder what projects they contributed to while learning all that and under what names.

Posted
Joey short
untitled

Or it could have been done to add cover for the actual backdoor insertion.

But "Additionally, the file contains random bytes to help test unforeseen corner cases." in Jia Tan's original commit seems pretty suspicious.

Also on closer look, no need for this to be a RISC-V version of the exploit, the file could have ended up used on any system.

Posted
Joey short
untitled

Noticed that Jia Tan modified several additional test files besides those used in the known backdoor.

This was at the same time as the known backdoored test files, so almost certainly these RISC-V test files also contain a version of the backdoor. Used where I wonder?

Posted
Joey short
untitled

what a day to get up at 5am for the second day in a row

9 hours sleep over 2 days and I'm trying to understand a state sponsored backdoor attack in detail

Posted
Joey short
untitled

I don't have all the issues and PRs unfortunately. I was just looking at the PR that added loongson support. Seems to have come from a legitimate person, he had academic publications.

Posted
Joey short
untitled

Github has disabled the github.com/tukaani-project/xz repository

That seems a bit of a problem for everyone who needs to understand the past activity there in order to fully address the backdoor. Sheesh

I have a clone from today if anyone needs it.

Posted
Joey short
untitled

fwiw, for users running testing/unstable who upgrade to fix the security hole: sshd does not appear to be restarted by the upgrade, so you'll probably want to do that manually

Posted
Joey short
untitled

Debian is considering such a reversion here. I'm glad they're taking the possibility of further backdooring seriously.

(It's not quite as easy to revert as I'd thought it would be.)

bugs.debian.org/1068024

Posted
Joey short
untitled

I count a minimum of 750 commits or contributions to xz by Jia Tan, who backdoored it.

This includes all 700 commits made after they merged a pull request on Jan 7, 2023, at which point they appear to have already had direct push access, which would have also let them push commits with forged authors. Probably a number of other commits before that point as well.

Distributions are reverting the identified backdoor. This is insufficient given this volume of activity. Revert to before any of this.

Posted
Joey short
untitled

would probably be worth the time for someone in -devel to look at pristine-xz delta files archive-wide, to see if there are any unusually large ones that might hide such payloads

Posted
Joey short
untitled

Of course this is still possibly in there...

Posted
Joey short
untitled

Kind of glad that ssh access was a nice juicy target for the backdoored xz. Imagine if it had lurked until unpacking tar.xz sources and then ran arbitrary payloads embedded in the xz files. Could have allowed targeted ongoing exploitation of builds.

Posted
Joey short
untitled

tired: tea.xyz encouraging people to post spam documentation patches to free software projects

wired: spamming projects with spam documentation patches to build up enough cred to take over and backdoor xz

Posted
Joey short
untitled

I rag on github a whole lot, but this is one feature it has that I really like.

Since JiaT75 backdoored xz-utils, I have blocked him and now get to see a warning in every project he touched.

I hope wasmtime et al. are doing some careful review..

Posted
Joey short
untitled

Err, this was UPS actually. I'm so Fedex burnt that I crosswired the two.

Posted
Joey short
untitled

Fedex today: "Your delivery has been rescheduled for Monday. Your package is out for delivery today. Log in before April 15th or your My Fedex account will be deleted. That is not the right My Fedex password."

(Yes it is lol I can actually retain passwords unlike you.)

Posted
Joey short
untitled

up at 5 am second day in a row, I guess I'm switching to European time early before my trip on Monday

Posted
Joey short
untitled

cursed TV screenshot


remembered that TV still exists so yes, I am watching QVC and EXPTV at the same time.

Posted
Joey
the vulture in the coal mine

Turns out that VPS provider Vultr's terms of service were quietly changed some time ago to give them a "perpetual, irrevocable" license to use content hosted there in any way, including modifying it and commercializing it "for purposes of providing the Services to you."

This is very similar to changes that Github made to their TOS in 2017. Since then, Github has been rebranded as "The world’s leading AI-powered developer platform". The language in their TOS now clearly lets them use content stored in Github for training AI. (Probably this is their second line of defense if the current attempt to legitimise copyright laundering via generative AI fails.)

Vultr is currently in damage control mode, accusing their concerned customers of spreading "conspiracy theories" (-- founder David Aninowsky) and updating the TOS to remove some of the problem language. Although it still allows them to "make derivative works", so could still allow their AI division to scrape VPS images for training data.

Vultr claims this was the legalese version of technical debt, that it only ever applied to posts in a forum (not supported by the actual TOS language), and basically that they and their lawyers are incompetent but not malicious.

Maybe they are indeed incompetent. But even if I give them the benefit of the doubt, I expect that many other VPS providers, especially ones targeting non-corporate customers, are watching this closely. If Vultr is not significantly harmed by customers jumping ship, if the latest TOS change is accepted as good enough, then other VPS providers will know that they can try this TOS trick too. If Vultr's AI division does well, others will wonder to what extent it is due to having all this juicy training data.

For small self-hosters, this seems like a good time to make sure you're using a VPS provider you can actually trust to not be eyeing your disk image and salivating at the thought of stripmining it for decades of emails. Probably also worth thinking about moving to bare metal hardware, perhaps hosted at home.

I wonder if this will finally make it worthwhile to mess around with VPS TPMs?

Posted
Joey short
untitled

Oh, the updated TOS still allows them to "make derivative works", so it probably still allows AI training.

Posted
Joey short
untitled

Look at it this way... the bit about them getting a license to any content in the Service is in the same paragraph where it talks about how you may not host illegal content on the Service. If this only applied to forum posts somehow (which it does not; "the Service" is clearly defined at the top as every part of Vultr), then they would not be prohibiting illegal content being stored in a VPS.

Posted
