/t/ - Technology

Discussion of Technology


(4.11 KB 300x100 simplebanner.png)

Hydrus Network #12 Anonymous Board volunteer 09/24/2025 (Wed) 18:29:57 No. 24277
This is a thread for releases, bug reports, and other discussion for the hydrus network software.

The hydrus network client is a file-management application written for Anon and other internet-fluent media nerds who have large file collections. It browses with tags instead of folders, a little like a booru on your desktop. If they wish, users can easily share tags anonymously through a public server. Everything is free, no ads, and privacy is the first concern. If you have 10,000+ files and cannot find anything, hydrus might help!

I am the hydrus developer. I am continually working on the software and try to put out a new release every Wednesday by 8pm EST. Past hydrus imageboard discussion, and these generals as they hit the post limit, are being archived at >>>/hydrus/ .

Hydrus is powerful, complicated, and messy. It is not for everyone. If you would like to learn more, please check out the extensive help and getting started guide here: https://hydrusnetwork.github.io/hydrus/
>>24785 + >>24781
>Check options->downloading. You can have it run random, which I generally recommend if you don't mind, since that smooths out a bunch of bandwidth and checker timing bumps, or alphabetical.
Thanks dev. I'm just using this to make my artist searches run before concept searches; it helps keep the import times in order when an artist releases a comic/series where only some files match a concept search. This option to run alphabetically and add a special leading character to the subscriptions I want running first/last is a good solution.
>>24713
>648
Hydrus Companion for Chromium worked for me in, I think, November, but now it says something like "Network error (Hydrus not running?)"
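For what it's worth, the quick thing to check is whether the Client API itself still answers before blaming the add-on. A rough sketch, assuming the default port 45869 (I'm going from memory on the endpoint name, so double-check the Client API docs):

import requests

# /api_version needs no access key; if this fails, Companion cannot connect either
resp = requests.get("http://127.0.0.1:45869/api_version", timeout=5)
print(resp.status_code, resp.json())

If that times out, the API service is probably turned off under services > manage services, or the port changed.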
>>24783 Neither gallery-dl nor hydownloader work for my case because gallery-dl does not support the site and the very specific use case I have for it, and hydownloader fundamentally relies on gallery-dl. That I know, I'd have to compile my own version of gallery-dl with my own extractor just to save maybe a minute or two while bulk-downloading images. It'd probably be simpler to make the changes I want to Hydrus itself than to go through an extended dependency rigamarole, but again, spending hours coding just to save a few minutes of manually regexing and pasting image links for myself and only myself is ridiculous. Thanks, I'll wait for the eventual API parsing reconstruction.
While most new releases do seem to prefer .cbz, I still have a collection of old .cbr files. I could repack them, but... I was hoping adding rar support wouldn't be too much work if it could use the same algorithms. Calibre, also a python program, uses the python library unrardll for .cbr support. I'm really hoping to leave Calibre behind asap because it eagerly makes changes to imported files! Bleh!
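Reading a .cbr is not much code on the Python side, for what that's worth; here's a little sketch using the rarfile package (a different wrapper than Calibre's unrardll, and it shells out to an unrar backend, so the extra dependency is the real cost rather than the code):

import rarfile  # pip install rarfile; needs an unrar backend (unrar/unar/bsdtar) on PATH

with rarfile.RarFile("comic.cbr") as cbr:
    # list the pages, then pull the first one out as bytes
    pages = sorted(n for n in cbr.namelist() if n.lower().endswith((".jpg", ".jpeg", ".png", ".webp")))
    first_page = cbr.read(pages[0])
    print(len(pages), "pages, first page is", len(first_page), "bytes")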
I had a great week finishing up the duplicates database tech I had been planning. The duplicates and duplicates auto-resolution systems are much faster when reacting to file status changes, and a number of annoyances are ironed out. I also did a scatter of small misc work, but it is mostly duplicates stuff this week. The release should be as normal tomorrow.
>zooming in on pictures in preview looks like this
is this user error or a deliberate choice?
>>24815
>zooming
I don't follow. I don't see anything abnormal.
>>24816 the overlayed qView window is the same image, also with zoom, but the hydrus media viewer specifically has extreme artifacting. That is fine for thumbnails and maybe the preview window, but it seems completely unacceptable for even basic browsing, especially of pixel art, when it's the full media viewer.
I'm trying to import a file with a json sidecar but I get this error when clicking the "edit parsing formula" button:

v649, linux, source
TypeError: unhashable type: 'list'
File "/opt/hydrus/hydrus/client/gui/widgets/ClientGUICommon.py", line 357, in EventButton
    self._func( *self._args, **self._kwargs )
File "/opt/hydrus/hydrus/client/gui/metadata/ClientGUIMetadataMigrationImporters.py", line 242, in _EditJSONParsingFormula
    panel = ClientGUIParsingFormulae.EditJSONFormulaPanel( dlg, collapse_newlines, self._json_parsing_formula, test_data )
File "/opt/hydrus/hydrus/client/gui/parsing/ClientGUIParsingFormulae.py", line 1064, in __init__
    self._parse_rules.SetData( parse_rules )
File "/opt/hydrus/hydrus/client/gui/lists/ClientGUIListBoxes.py", line 1424, in SetData
    self._listbox.SelectData( selected_datas )
File "/opt/hydrus/hydrus/client/gui/lists/ClientGUIListBoxes.py", line 299, in SelectData
    list_widget_item.setSelected( data in datas )
>>24818 Thank you--another guy reported this exact same bug to me earlier this week and I just couldn't figure it out. His situation was after creating a new content parser, switching its formula to JSON, and then trying to edit. It all works fine for me and I can't figure out where this spare list is coming from. Today's v650 will skip over the error, but I expect it'll now cause a new one further into the panel, since it sounds like we have some weird data getting in somehow.

Can you click the 'export' button on that edit formula panel that fails and then paste the JSON here? For instance, when I was retracing his steps with an empty formula, I exported this:

[31, 4, [[[0, [51, 1, [0, "posts", null, null, "posts"]]]], 0, "", [84, 1, [26, 3, []]]]]
>>24817 You're talking about the lanczos interpolation scaler. It often creates these ugly ring artifacts. In options -> media playback, you can switch the media viewer to use bilinear or nearest neighbor instead. For pixel art you want nearest neighbor.

Side note, but I wish Hydrus had more options for scalers. I don't like lanczos and bilinear is blurry, but I wish it had something better than just nearest neighbor. The ewa_lanczossharp filter from mpv is pretty good, and it'd be cool if Hydrus supported that.
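If you want to eyeball the difference outside hydrus, the same trade-off is easy to reproduce with Pillow; a quick sketch (not what hydrus does internally, just the same scaler families):

from PIL import Image

img = Image.open("pixel_art.png")
big = (img.width * 8, img.height * 8)

# nearest keeps hard pixel edges; lanczos is sharper for photos but rings on flat-colour art
img.resize(big, Image.Resampling.NEAREST).save("nearest.png")
img.resize(big, Image.Resampling.BILINEAR).save("bilinear.png")
img.resize(big, Image.Resampling.LANCZOS).save("lanczos.png")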
>>24810 unrar is a bitch to work with. It uses COM files. I'm working on a new comic reader, and actually got AI (Grok) to help me with it. My program was able to read rar files in the end, but now I find it can't read RAR5 files, so that's another step to be dealt with. Unrar is just really hard to work with.
>>24810 If you're looking for a comic reader that can read .rar files, I suggest ComicRack. I'm modeling my new program after it, but it can't do the one thing that I really want, and that is have unlimited stacks. Cyolito once told me he couldn't do it, because he had programmed himself into a corner. So I'm attempting to get at least a basic viewer that can do that.
(50.69 KB 600x600 barf red.jpg)

>>24823
>and actually got Ai (Grok) to help me with it
https://www.youtube.com/watch?v=vnIxlW2lwgg

windows
zip: https://github.com/hydrusnetwork/hydrus/releases/download/v650/Hydrus.Network.650.-.Windows.-.Extract.only.zip
exe: https://github.com/hydrusnetwork/hydrus/releases/download/v650/Hydrus.Network.650.-.Windows.-.Installer.exe
linux
tar.zst: https://github.com/hydrusnetwork/hydrus/releases/download/v650/Hydrus.Network.650.-.Linux.-.Executable.tar.zst

I had a great week finishing the last of my duplicate database tech updates.

Full changelog: https://hydrusnetwork.github.io/hydrus/changelog.html

search undo

I forgot to mention last week that the user who sent in all the UI updates also started an undo system for adding/removing predicates in normal search pages. If you remove a complicated predicate by accident, try hitting the 'undo' menu up top--you should be able to recover it now!

duplicates

I hashed out a ton of clever database updates this week for duplicates. The tl;dr is that duplicates and duplicates auto-resolution now count faster and stop doing some annoying stuff. I am now done with pretty much everything I wanted to do here, so I am going to polish the help and finally launch this system for all users properly next week. If you have been waiting to try this out when it was nice to use, please watch this space. If you have a bunch of big auto-resolution rules, they are going to do a bit of work on update this week. Shouldn't take more than a few seconds for each rule.

Specifically, the work this week made all the incremental search/count stuff update smartly on any duplicate pair change instead of just resetting to be regenerated from scratch. So, when you have a duplicate pair panel that says '10,000 pairs searched; 523 match', resolving a duplicate or having a file import now just quickly adjusts the count by a tiny amount rather than loading up the whole thing and counting again. You can now watch the duplicates page while plenty of imports are going on and it is no longer a refresh-fest. The 'initialising' step you will have seen a bunch of before now only lags the first time you do a search.

Furthermore, the duplicates system now does these updates when files move between local file domains or into/out of the trash. So, moving files around in complicated search domains, or simply just deleting a file, will now trigger an instant quick count update where it matters. And, on top of that, auto-resolution rules are now plugged into this local file domain tech. An auto-resolution rule is now careful to no longer track pairs where one file leaves its search domain (e.g. entering the trash), and it does the same incremental count update. A bunch of 'why is this pair still in this auto-resolution queue/count even though it is deleted' stuff is now fixed.

I also re-did some refresh stuff in the 'review pending actions' panel, so that any time you close the duplicate filter on that guy, it will now refresh the queue properly.

next week

Final polish and a help pass for duplicates auto-resolution, so I can launch it properly.
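If it helps to picture the change, the idea is just to keep the running totals and nudge them per event instead of recounting the whole search; a toy sketch of the shape of it, not the actual hydrus code:

# toy illustration only, not hydrus's real duplicates code
class PairCounts:

    def __init__(self, searched, matched):
        self.searched = searched
        self.matched = matched

    def pair_resolved(self, was_match):
        # one pair leaves the queue: adjust by one instead of re-running the whole search
        self.searched -= 1
        if was_match:
            self.matched -= 1

counts = PairCounts(searched=10000, matched=523)
counts.pair_resolved(was_match=True)
print(counts.searched, counts.matched)  # 9999 522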
>>24826
>I forgot to mention last week that the user who sent in all the UI updates also started an undo system for adding/removing predicates in normal search pages. If you remove a complicated predicate by accident, try hitting the 'undo' menu up top--you should be able to recover it now!
Super fucking cool.
>>24825 Lol! Grok (and ChatGPT) are actually really good at programming, especially when it comes to C++ and Qt. You just have to keep talking to the AI to refine the code that has been put out, pointing out the changes that you want. I could never have written what I have so far without it, especially the COM file part for unrar.dll.
Can someone please post their HTTP header config for hydrus? I mean the entry under network > data > manage http headers, for gelbooru; I found the values in Firefox. Pretty sure "fringeBenefits=yup" is the header I need, but it's not working no matter how I try to format the entry in hydrus.
>>24828 That's cool, but I always fear the idiocracy future where AI-enabled jeets have no idea what they are working on and all the highly skilled professionals (like dev-sama) are pushed out of the market lol.
>>24829 I don't see a header for Gelbooru, and I do download from them. I only have headers for Sankaku and Pixiv. So glad I didn't pursue my CS degree beyond an Associate's. Went with Medical Technology, but AI took them both over in the end. I can't see a future with human programmers. AI will be FAR beyond them in knowledge.
>>24829 >>24832 I worded that poorly. Can someone post their gelbooru http header config?
>>24829 >>24833 I have my benefits in the cookies. I checked my HTTP headers, nothing for gelbooru.
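If you'd rather push the cookie in programmatically than dig through menus, the Client API can do it; a rough sketch from memory (double-check the /manage_cookies/set_cookies payload shape against the Client API docs, and your access key needs the manage cookies permission):

import time
import requests

API = "http://127.0.0.1:45869"
KEY = "your client api access key here"

# fringeBenefits is a cookie, not a header, so it goes into the gelbooru session as one
payload = {"cookies": [["fringeBenefits", "yup", "gelbooru.com", "/", int(time.time()) + 86400 * 365]]}
requests.post(API + "/manage_cookies/set_cookies", json=payload,
              headers={"Hydrus-Client-API-Access-Key": KEY}, timeout=10).raise_for_status()

Hydrus Companion's send-cookies button does the same thing if you already have that set up.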
(1.50 MB 1024x768 import_with_sidecar.webm)

>>24818 >>24820 I tried with an empty DB in v649, and it also gives the error. Then trying v650, I can now open the formula editor. The empty parser exports as:

[109, 3, [[26, 3, [[2, [114, 3, [[84, 1, [26, 3, []]], true, "", [55, 2, [[], "my_image.jpg.json"]], [31, 4, [[], 0, "", [84, 1, [26, 3, []]]]]]]]]], [84, 1, [26, 3, [[2, [99, 1, [2, true, null]]]]]], [115, 1, "6c6f63616c2074616773"]]]

I tried two different JSONs, but I don't think it's detecting the raw data properly. Instead of the expected object I've seen it empty (vid related) and with '0' as 'raw data', nothing like the files I put in. If I give it a malformed JSON, the 'raw data' is a JSONDecodeError, but it does display the expected JSON after the error text...
>>24826 So I updated from 647 to 650, but the upgrading screen was taking longer than it should have, so I SIGTERM'd hydrus and restarted it. It seems fine now and there's nothing wrong in the logs. Anything I should look at to make sure I didn't subtly break something?
Are there any working nitter and/or bluesky downloaders? The nitter one in the cuddlebear repo seems to silently block Hydrus, and trying a few other instances they either cuckflare block or think Hydrus is an AI scraper and block it. Or maybe someone here has a gallery-dl config that can make a json sidecar with all the metadata Hydrus supports, that would be great.
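For the sidecar half of that, the closest I've found is gallery-dl's stock metadata postprocessor; a sketch of the config fragment (it dumps whatever the extractor exposes as a .json next to each file, which isn't literally everything hydrus supports, but hydrus's sidecar importer can map the fields you care about):

{
    "extractor": {
        "postprocessors": [
            { "name": "metadata", "mode": "json", "extension": "json" }
        ]
    }
}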
Is AI tagging viable yet?
>>24834 You're able to download all site content? I'll try importing cookies
>>24844 I messed with the WD14 tagger in ComfyUI and did image > tagger > text to image, and it's pretty accurate. The tuning is in the tagger strength. Image upscale is another good use of AI, but it would obviously double your DB size.
Is there a hydrus companion alternative? It has a bad memory leak in firefox.
>>24841 Hydownloader can grab twitter & bluesky.
>>24844 https://github.com/Garbevoir/wd-e621-hydrus-tagger
I use this; it's ok. It's not great for weird things like 4chan memes, but it's pretty good for stuff you'd find on a booru. It was able to tell pic1 was aged down Kiara, but it thinks pic3 is a Pokemon creature. Still beats having no tags.
>>24849 Just a heads up, there have been a v3 and a v3 Large of the WD v1.4 ViT tagger out for quite a while now. Both are better than v2 imo (especially v3 Large).
https://huggingface.co/SmilingWolf/wd-vit-large-tagger-v3
https://huggingface.co/SmilingWolf/wd-vit-tagger-v3
I've been using this for Automatic1111, which should auto-download those newer models (and the older one) for anyone that wants a GUI: https://github.com/67372a/stable-diffusion-webui-wd14-tagger
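If anyone wants to skip the webui and script it, the repos above are just an ONNX model plus a selected_tags.csv; a rough sketch of how they usually get run (the file names, NHWC input layout, and exact preprocessing are from memory, so check the repo card, and the hydrus-tagger project linked earlier wraps all of this for you):

import csv
import numpy as np
import onnxruntime as ort
from PIL import Image
from huggingface_hub import hf_hub_download

REPO = "SmilingWolf/wd-vit-tagger-v3"
model_path = hf_hub_download(REPO, "model.onnx")
tags_path = hf_hub_download(REPO, "selected_tags.csv")

session = ort.InferenceSession(model_path)
inp = session.get_inputs()[0]
size = inp.shape[1]  # model takes square NHWC images

# rough preprocessing: resize to the model's square size, RGB -> BGR, float32 in 0-255
img = Image.open("pic.jpg").convert("RGB").resize((size, size))
arr = np.ascontiguousarray(np.asarray(img, dtype=np.float32)[:, :, ::-1])[None, ...]

probs = session.run(None, {inp.name: arr})[0][0]
with open(tags_path, encoding="utf-8") as f:
    names = [row["name"] for row in csv.DictReader(f)]
for name, p in sorted(zip(names, probs), key=lambda t: -t[1])[:20]:
    print(f"{p:.2f} {name}")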
why has hydrus stopped downloading all images from a bluesky link? it only downloads the first one for me now
>>24850 Nice, what specs do I need for the large? I run AMD so I don't get CUDA acceleration, if that matters.
>>24852 Not sure tbh, but it should not be very demanding. I've got a 12GB RTX 3060 which is overkill for such small models. v2 and v3 are both about the same size at 378MB, so you should have no issues running standard v3 at about the same speed since you're already running v2. v3 Large is a little over 3x the size at 1.26GB. That's still quite small as far as models go. It will be slower to process, but the trade off for better tags is worth it. You can probably run it no problem. v3 Large took an average of 4.25 seconds per image over 2,973 images (or 3.5 hours) on my machine earlier today. Pic related.
(15.90 KB 224x932 Clipboard_12-05-2025_02.png)

>>24852 >>24853 D'oh! Meant to add this. Here's how v3 large tagged that Kiara pic. Unfortunately, I'm not aware of any models that handle IRL photos very well. They're all trained off boorus and similar sites because the images are already part of a huge reliably tagged dataset.
>>24852
>I run AMD so I don't get CUDA acceleration
you technically do through ROCm, but it's not that great
Mr. President, a second * -> clothing:* sibling has hit the PTR.

