Because it can. With ten year old technology and fierce determination.
104 miles per gallon and 0-60 in 4 1/2 seconds, with a little turbo diesel.
Motherfuckers won’t do it because they are making Billions.
This country has been going downhill since the creation of SWAT teams. I understand the need, but not the application. Kicking in Grandma's door at 3 a.m. because Junior sold pot to an undercover cop is overkill; a simple knock on the door would suffice, or just pick the kid up on the street.
What we are creating with this new domestic police force is also a simple framework to normalize the use of military force against EVERY American (Obama’s wet dream BTW). We need the illegals to leave, and the simplest way to do that, AT NO COST TO THE TAXPAYER, is to stop giving them free shit.
Be aware too, that Palantir, the company that developed Israel’s internal surveillance systems, is under contract with our own government to build and institute a comparable system here in the US.
Stay safe, and never, ever forget Ruby Ridge and Waco.





Contributed by Wild, wild west and Don’t mind me.

We’ve all seen ’em…
Been there, done that, just without the soundtrack.
Vox Popoli, https://www.voxday.net/
I asked Markku to explain why the AI companies have such a difficult time telling their machine intelligences to stop fabricating information they don't possess. I mean, how difficult can it be to simply say "I don't know, Dave, I have no relevant information" instead of going to the trouble to concoct fake citations, nonexistent books, and imaginary lawsuits? He explained that the AI instinct to fabricate information is essentially baked into their infrastructure, due to the original source of the algorithms upon which they are built.
The entire history of the internet may seem like a huge amount of information, but it's not unlimited. On any topic of marginal interest, there isn't all that much information, and mankind can't really produce it faster than it already does. Hence, we've hit the training data ceiling.
And the gradient descent algorithm will ALWAYS produce a result that looks like all the other results. Even if there is actually zero training data on a topic, it will still speak confidently on it. It's just all completely made up.
The algorithm was originally developed because fighter jets are so unstable that a human being cannot react fast enough, even in theory, to keep one in the air. So gradient descent takes the stick inputs as a general idea of what the pilot wants, and then interprets them into the signals to the actuators. In other words, it takes a very tiny amount of data and converts it into a very large amount of data. But everything outside the specific training data is always interpolation.
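The point about confident output beyond the training data can be shown with a toy model. This is not Markku's code or anything from the post, just a minimal sketch: a one-variable logistic classifier trained by gradient descent on a narrow band of inputs, then queried on an input a thousand times larger than anything it ever saw. It doesn't say "I don't know"; it reports near-total certainty.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical training data: inputs in [0, 1] labeled 0,
# inputs in [2, 3] labeled 1. Nothing outside [0, 3] is ever seen.
random.seed(0)
data = [(random.uniform(0, 1), 0) for _ in range(50)] + \
       [(random.uniform(2, 3), 1) for _ in range(50)]

# Plain batch gradient descent on logistic log-loss.
w, b = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = sigmoid(w * x + b)
        gw += (p - y) * x   # gradient of the loss w.r.t. w
        gb += (p - y)       # gradient of the loss w.r.t. b
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

# In-distribution queries behave sensibly:
print(sigmoid(w * 0.5 + b))    # near 0, as trained
print(sigmoid(w * 2.5 + b))    # near 1, as trained

# Far outside the training data, the model is MORE confident, not less:
print(sigmoid(w * 1000.0 + b))  # essentially 1.0 -- pure extrapolation
```

The fitted parameters carry no record of where the training data ended, so the model has no mechanism for signaling "no relevant information"; it can only extend the pattern it learned, which is the interpolation problem in miniature.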
For more on the interpolation problem and speculation about why it is unlikely to be substantially fixed any time soon, I put up a post about this on AI Central.
Posted by VD
I come across AI in the medical chart audits I perform, and most of it doesn't make sense. It lists citations and footnotes that don't exist, and medical journals and articles that either don't exist or are so obscure as to take days of intense searching to find. I am sure doctors and their transcriptionists don't spend the time or effort on that kind of dedicated search.


