With respect to 2, it would stop others scraping the content to train more open models on. This would essentially give Reddit exclusive access to the training data.
I love these guys, they are wankers, but they have a point. I didn’t get a say in any of the laws the upper class subject me to. Let ‘em clog up the courts for all I care.
Bind tun0 in the settings, but what I do is run BitTorrent in a Docker container with WireGuard so the VPN doesn’t affect my day-to-day browsing.
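For anyone curious what that looks like, here’s a minimal docker-compose sketch, assuming the linuxserver.io WireGuard and qBittorrent images (the image names, ports, and config path are illustrative, not a tested setup):

```yaml
services:
  wireguard:
    image: lscr.io/linuxserver/wireguard
    cap_add:
      - NET_ADMIN
    volumes:
      - ./wg0.conf:/config/wg0.conf   # your VPN provider's WireGuard config
    ports:
      - "8080:8080"                   # qBittorrent web UI, published via the VPN container

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent
    network_mode: "service:wireguard" # share the VPN container's network namespace
    depends_on:
      - wireguard
```

Because qbittorrent shares the wireguard container’s network namespace, all torrent traffic goes through the tunnel while the host’s normal browsing is untouched. Note the ports are published on the wireguard service, since that’s the container that actually owns the network stack.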
Let’s put it this way, I’d be surprised if they didn’t have a backup of every single one of your messages.
Just add 11 to UTC.
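That conversion is one line of Python with a fixed UTC+11 offset (a sketch; a real timezone shifts with DST, so `zoneinfo` would be more robust in practice):

```python
from datetime import datetime, timedelta, timezone

# Fixed UTC+11 offset (e.g. AEDT); illustrative only
utc_plus_11 = timezone(timedelta(hours=11))

t_utc = datetime(2023, 6, 15, 3, 30, tzinfo=timezone.utc)
t_local = t_utc.astimezone(utc_plus_11)
print(t_local.strftime("%H:%M"))  # 03:30 UTC -> 14:30 local
```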
No harder than having different times in different places.
So it looks like protonmail is actually legit then
Of course poor regulation can be bad; it was a silly, loaded question. Look at, for example, the 2002 tort reforms and the damage they did to public safety.
Imagine how much damage could be done to individual privacy and freedom by an ill-informed legislature if they elect to regulate gradient descent.
No, they said BS is published about AI.
Unfortunately we don’t all share your health and fitness.
Or climate; many people don’t have the luxury of living in the 10–25 deg range.
You want H2OGPT, or just use LangChain with a CLI.
I just use an old laptop.
If and only if the trained model is accessible without licence.
E.g. I don’t want Amazon rolling out an LLM for $100 a month based on freely accessible tutorials written by small developers.
But yeah duck copyright
I wish there was more variety.
You basically have BSD and Linux, and in the Linux space {glibc/musl, systemd/openrc/runit, PKGBUILD/ebuild/deb/rpm},
which seems like a lot, but it’s the really niche stuff that’s fun to pull apart and play with.
Well, to clarify, the two big differences here are that the exe is precompiled, and maybe dynamic libraries.
Heavy tech stacks do suck though
These comments often indicate a lack of understanding about AI.
ML algorithms have been in use for nearly 50 years. They’ve certainly become much more common since about 2012, particularly with the development of CUDA. It’s not just some new trend or buzzword.
Rather, what we’re starting to see are the fruits of our labour. There are so many really hard problems that just cannot be solved with deductive reasoning.
So I tried adding SSL and I still couldn’t get Voyager to work with a self hosted instance.
If anybody has figured out how to get a self-hosted lemmy to work with an app I’d love to know how. For the moment I’m using NodeBB which is fine.
I have Caddy and Lemmy running in containers on an Alpine box whose hostname is “john”; it’s configured with a static IP at the router, 192.168.1.200. Caddy is set to network_mode: "host".
Here were my steps:

1. Install `bind-utils` for `dig`/`nslookup` for debugging.
2. Install `dnsmasq` and add `address=/john/192.168.1.200`.
3. Add a `tls internal` directive to the Caddyfile:

```
http://lemmy.john {
	reverse_proxy :1236
}

https://lemmys.john {
	tls internal
	reverse_proxy :1236
}
```

4. Run an HTTP server to make the `.crt` files accessible on the iOS device: `python3 -m http.server`
5. Open `https://lemmys.john` in Safari (the `lemmys` one, just to be sure) and ensure it loads normally, as a typical site would.

When I try to open it from the app it says “Problem connecting to lemmys.john. Please try again”. It definitely loads fine in Safari, so there must be something particular to the app that I’m missing.
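For step 4, this is roughly how I got the certificate off the box — a sketch assuming the official Caddy image (which keeps its internal CA under `/data/caddy/pki/`) and a container named `caddy`; adjust names and paths to your setup:

```shell
# Copy Caddy's internal CA root certificate out of the container
docker cp caddy:/data/caddy/pki/authorities/local/root.crt .

# Serve it on the LAN so the iOS device can download it;
# install the profile, then enable full trust for the root under
# Settings > General > About > Certificate Trust Settings
python3 -m http.server 8000
```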
Edit: unless this is an issue with the nginx reverse proxy inside the container 🤔
Translation is very different from generation.
As a matter of fact, even AI generation has different grades of quality.
SEO garbage is certainly not the same as an article with AI-generated components, and both are very different from a translated article.
Further TL;DR
In preparation for an IPO:
Reddit: you must now only use our app to prop up our ad revenue. No third-party apps (unless you pay us handsomely).
Everyone: no thanks, we’ll just make our own alternative.
This is the only path forward.