

And following up on that threat was obviously going to be horrible for the people in Gaza.
But please continue patting yourself on the back…


If a system is able to change its output or behavior to account for new information, has it not learned?
Depends on the setup and what you call learning. If you let them, bots can write down things to remember in future prompts, and edit those “memories”.
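A minimal sketch of that setup, with invented names and a stubbed model() call, just to show that a "memory" is ordinary text plumbed back into the next prompt:

    #include <iostream>
    #include <string>
    #include <vector>

    // Stub standing in for a real LLM API call.
    std::string model(const std::string& prompt) {
        (void)prompt;  // a real call would send this text to the model
        return "Noted. MEMORY: user prefers metric units";
    }

    int main() {
        std::vector<std::string> memories;  // would be persisted between sessions

        // Build the next prompt: stored "memories" are just text prepended to it.
        std::string prompt = "Memories:\n";
        for (const auto& m : memories) prompt += "- " + m + "\n";
        prompt += "User: I use metric.";

        std::string reply = model(prompt);

        // If the reply asks to remember something, keep it for future prompts.
        const std::string tag = "MEMORY: ";
        if (auto pos = reply.find(tag); pos != std::string::npos)
            memories.push_back(reply.substr(pos + tag.size()));

        std::cout << "memories now: " << memories.size() << "\n";
    }

Whether that counts as "learning" is exactly the definitional question: the weights never change, only the text fed in.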


What do you think it’s meant for? It’s a perfect fit for any database that needs to uniquely identify a person.
The problem is using it as authorization without any other checks.
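A toy sketch of the distinction (all names and the secret are invented): the SSN works fine as a lookup key, but access has to hinge on a separate credential.

    #include <cassert>
    #include <string>
    #include <unordered_map>

    struct Person { std::string name; };

    int main() {
        // Identification: the SSN is a perfectly good unique key for a record.
        std::unordered_map<std::string, Person> by_ssn;
        by_ssn["123-45-6789"] = {"Jane Doe"};

        // Authorization done wrong: knowing the identifier counts as proof.
        auto bad_auth = [&](const std::string& ssn) {
            return by_ssn.count(ssn) > 0;  // no credential checked at all
        };

        // Authorization needs an independent secret (password, token, signed
        // session...), sketched here as a per-user secret checked alongside.
        auto ok_auth = [&](const std::string& ssn, const std::string& secret) {
            return by_ssn.count(ssn) > 0 && secret == "per-user-secret";
        };

        assert(bad_auth("123-45-6789"));  // anyone who knows the SSN is "in"
        assert(ok_auth("123-45-6789", "per-user-secret"));
    }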


Good. I remember my parents boycotting South Africa over apartheid. Israel is acting much worse than they were.
What? That is true in small companies too.


Obviously wrong.


As far as I can tell, his involvement is investing in a rare earth mineral company as part of a green energy initiative. I don’t see anything connecting him to some “freedom city”. That’s Peter Thiel: https://www.insidehook.com/internet/peter-thiel-praxis-next-great-city-greenland


Lol. If that was true they would vote very differently.


Moving the goalposts is an informal fallacy in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded.


Your original argument was that this conflict was opened by kidnapping the head of a state.
Faced with a counterpoint, you’re arguing it’s not like a much more serious invasion.
True, but that doesn't invalidate the fact that it was not opened by a kidnapping.


Have you heard about the expression “moving the goalposts”?
Americans keep voting for people who keep giving them worse health care and education… It’s sad, but it is what it is.


Also cementing the USA's new reputation as an unreliable business partner.


It’s been frowned upon for decades.
That leads us to our second principle of object-oriented design: Favor object composition over class inheritance
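For the unfamiliar, a minimal C++ sketch of what that principle buys you (all names invented). The inheritance version would be class Logger : public ConsoleWriter, hard-wiring one output strategy into the type; with composition the strategy stays swappable:

    #include <iostream>
    #include <memory>
    #include <string>

    // Small interface to compose against.
    struct Writer {
        virtual ~Writer() = default;
        virtual void write(const std::string& s) = 0;
    };

    struct ConsoleWriter : Writer {
        void write(const std::string& s) override { std::cout << s << "\n"; }
    };

    // Composition: Logger *has a* Writer instead of *being* one, so the
    // output strategy can be swapped without subclassing Logger.
    class Logger {
        std::unique_ptr<Writer> out_;
    public:
        explicit Logger(std::unique_ptr<Writer> out) : out_(std::move(out)) {}
        void log(const std::string& msg) { out_->write("[log] " + msg); }
    };

    int main() {
        Logger logger(std::make_unique<ConsoleWriter>());
        logger.log("composed, not inherited");
    }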


“roughly five weeks”
Since pregnancy weeks are counted from the first day of the last period, not from conception, and a missed period shows up around week 4, "roughly five weeks" puts this operation within about one week of her missed period. Perhaps it wasn't so clearly within the law.


I pasted a 1k-line C++ file into Gemini, along with a screenshot and a trace log, and asked it to find the bug. It reasoned for about 5 minutes. Extract of the solution:
There has to be an issue in this sequence for i=21:
1. V_to_next_xy (norm) is (5.00004113497265e-08, -0.999999999999999, 0). (Correct)
2. Unnormalized perpendicular P_un = Vec3(V_to_next_xy.y(), -V_to_next_xy.x(), 0.0) should be (-0.999999999999999, -5.00004113497265e-08, 0).
3. temp_bisector becomes P_un.
4. temp_bisector.normalize() is called.
5. The result of this normalization is logged as temp_bisector (result of collinear path): (-1.78813934326169e-07, -0.999999999999984, 0).
This step (5) is where the numerical error occurs. The vector (-0.999…, -5e-8, 0) when normalized should not become (-1.78e-7, -0.999…, 0).
The latter is almost parallel to (0, -1, 0), while the former is almost parallel to (-1, 0, 0).
It’s as if temp_bisector.normalize() for the input (-0.999…, -5e-8, 0) is producing something wildly incorrect, or the vector being normalized was actually (-1.78e-7 * L, -0.999… * L, 0) for some L.
What if temp_bisector = Vec3(V_to_next_xy.y(), -V_to_next_xy.x(), 0.0); itself has precision issues?
If V_to_next_xy.y() is -0.999999999999999 (15 nines), this is -(1.0 - 1e-15).
If V_to_next_xy.x() is 5.00004113497265e-08.
Vec3 tmp = Vec3(-(1.0-1e-15), -5.00004113497265e-08, 0.0).
This should be fine.
The most plausible candidate is Vec3::normalize() failing on this specific input vector (-0.999999999999999, -5.00004113497265e-08, 0).
Perhaps length() computes sqrt(x*x + y*y + z*z).
x*x = (-(1.0-1e-15))^2 = (1.0-1e-15)^2 = 1.0 - 2e-15 + 1e-30.
y*y = (-5e-8)^2 = 2.5e-15.
x*x + y*y = 1.0 - 2e-15 + 1e-30 + 2.5e-15 = 1.0 + 0.5e-15 + 1e-30.
length = sqrt(1.0 + 0.5e-15 + 1e-30). This is extremely close to 1.0.
If this length calculation has any error causing it to be, for example, abs(y) by mistake, then x/length would become x/abs(y) and y/length would become y/abs(y) = sign(y).
x/abs(y) = -1.0 / 5e-8 = -2e7. This is not what’s happening.
The logged value for temp_bisector (-1.78813934326169e-07, -0.999999999999984, 0)
is suspicious because its X component -1.78813934326169e-07 is very close to the X component of V_to_prev_xy (norm) from loop i=21, which is -5.00004113497265e-08, divided by a factor. Or perhaps V_to_next_xy.x() itself was somehow used more directly than -V_to_next_xy.x() in the Y component of the vector passed to normalize.
The logged temp_bisector (result of collinear path) for i=21 is (-1.78813934326169e-07, -0.999999999999984, 0).
Consider Loop i=20: temp_bisector (result of collinear path): (1, 0, 0). Its input for perpendicular construction was V_to_next_xy (norm) = (X_v: 4.9999318885022e-08, Y_v: -0.999999999999999, Z_v: 0).
Perpendicular from this is (Y_v, -X_v, 0) = (-0.999…, -4.999…e-8, 0). Normalizing this gives (-1, -4.999…e-8, 0).
The log for i=20 says it becomes (1,0,0). This is also wrong. It should be (-1, -very_small_y, 0).
The fallback temp_bisector = Vec3(1,0,0) only happens if temp_bisector.normalize() (the one on the perpendicular) also fails.
It seems the normalize() call for the vector constructed as Vec3(V_to_next_xy.y(), -V_to_next_xy.x(), 0.0) is problematic when one component is very small.
It correctly identified that sqrt(_v[0]*_v[0] + _v[1]*_v[1] + _v[2]*_v[2]); had too low precision and using std::hypot(_v[0], _v[1], _v[2]) would likely solve it.
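For context, a minimal sketch of the two length computations on the flagged input. It is an assumption-laden toy (the real Vec3 may store floats or differ otherwise) and won't reproduce the exact garbage values from the log, but it shows the swap being proposed:

    #include <cmath>
    #include <cstdio>

    int main() {
        // The input vector flagged in the log at i=21, as plain doubles.
        double x = -0.999999999999999, y = -5.00004113497265e-08, z = 0.0;

        // Naive length: x*x + y*y lands within a few ulps of 1.0, so most
        // of y's contribution is rounded away before the sqrt.
        double naive = std::sqrt(x*x + y*y + z*z);

        // std::hypot (3-argument form, C++17) is specified to avoid
        // intermediate overflow/underflow and typically keeps more precision.
        double hyp = std::hypot(x, y, z);

        std::printf("naive : %.17g -> (%.17g, %.17g)\n", naive, x/naive, y/naive);
        std::printf("hypot : %.17g -> (%.17g, %.17g)\n", hyp, x/hyp, y/hyp);
    }

On IEEE doubles the two lengths agree here to within an ulp or so, so a failure as dramatic as the logged one suggests the original length() lost precision somewhere narrower (float intermediates, say); std::hypot sidesteps both hazards either way.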
If this is just autocomplete, then I agree that it’s a pretty fancy one.


The mean IQ was higher in the region with medium fluoride concentration in drinking water (56.68) than in areas with low fluoride concentration (41.03) or high fluoride concentration (31.59).
So according to that study, having "medium levels" (1.2-2 ppm) of fluoride is much better. I checked three random water reports in Florida, which had 0.5, 0.7 and 0.9 ppm.


Which is why I think sensitivity is the wrong word.
Trump was never going to try to stop it; he supports genocide and war crimes. Remember him telling Israel to "finish the job in Gaza"?