Ultraviolet: I'm sorry to hear that. The only thing that worries me is whether the slowdown is related to my code (for example, holding too many artist names meant to be updated in memory), or whether last.fm has some sort of "unfair usage" detection that notices a large volume of web service calls from your machine and delays its answers.
Obviously, there is a hard upper limit given that last.fm allows 5 calls per second: at that rate, a machine asking for updates 24/7 can make at most 5 × 86,400 = 432,000 calls per day. If your library is humongous and you want it kept up-to-date, there is a point where even asking last.fm around the clock won't cover everything within that time frame.
But you've effectively made me aware that I haven't really written my code to handle this gracefully. Currently, it's just "get all artist names lacking last.fm metadata, load them into memory, iterate over all of them, ask last.fm for metadata, batch insert into a database import table every 1,000 answers, and when all is done, run the script that updates the real database tables from the import table". This is fairly efficient for a library of my size, but obviously not for one of yours.
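In rough terms, the current flow looks something like this. It's only an illustrative Python sketch; the function and table names (artists_missing_lastfm_metadata, insert_into_import_table, and so on) are made up for the example and aren't the actual code:

```python
# Sketch of the current approach: everything is loaded into memory first,
# and the real tables are only updated after the whole run has finished.
BATCH_SIZE = 1000  # batch insert into the import table every 1000 answers

def update_artist_metadata(db, lastfm_client):
    """db and lastfm_client stand in for the real database/web service layers."""
    # 1. All artist names lacking last.fm metadata are read into memory at once.
    pending = db.artists_missing_lastfm_metadata()

    batch = []
    for name in pending:
        # 2. One last.fm web service call per artist.
        batch.append((name, lastfm_client.artist_info(name)))

        # 3. Every 1000 answers, batch insert into the import table.
        if len(batch) >= BATCH_SIZE:
            db.insert_into_import_table(batch)
            batch.clear()

    if batch:
        db.insert_into_import_table(batch)

    # 4. Only when everything is done, run the script that moves rows
    #    from the import table into the real tables.
    db.update_real_tables_from_import()
```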
I'll change this to keep fewer artist names in memory during scanning, and to update the real database tables while the process is running. I'll also calculate an upper limit for what's realistically achievable given the 5 calls per second, and handle larger libraries differently rather than trying and failing.
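The plan would look roughly like the sketch below. Again, this is just an illustration with placeholder names (CHUNK_SIZE, time_budget_seconds, and the db/lastfm_client helpers are assumptions, not final code):

```python
import time

CHUNK_SIZE = 1000          # only keep this many names in memory at a time
MAX_CALLS_PER_SECOND = 5   # last.fm's published rate limit

def update_artist_metadata_chunked(db, lastfm_client, time_budget_seconds):
    # Upper limit for how many lookups can realistically fit in the time window.
    call_budget = MAX_CALLS_PER_SECOND * time_budget_seconds

    calls_made = 0
    while calls_made < call_budget:
        # Fetch only a small chunk of pending names instead of the whole library.
        chunk = db.artists_missing_lastfm_metadata(limit=CHUNK_SIZE)
        if not chunk:
            break  # nothing left to update

        batch = []
        for name in chunk:
            batch.append((name, lastfm_client.artist_info(name)))
            calls_made += 1
            time.sleep(1.0 / MAX_CALLS_PER_SECOND)  # stay under 5 calls/second
            if calls_made >= call_budget:
                break

        # Push this chunk into the import table and promote it to the real
        # tables right away, so partial progress survives even if the run
        # can't finish the whole library this time.
        db.insert_into_import_table(batch)
        db.update_real_tables_from_import()
```

The main point is that the update would make incremental progress per chunk instead of requiring the whole library to finish before anything shows up.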
I cannot give you a time frame for when this will happen. I'll finish a new feature that I'm halfway through and then prioritize this, but I'll have significantly less spare time over the coming weeks. So you can either wait for that, or try adding your music in batches (start by having just one media folder containing 50,000 tracks, update the search index, add another media folder with 50,000 more tracks, update the search index, and repeat until done).