• 0 Posts
  • 15 Comments
Joined 2 years ago
Cake day: July 4th, 2023

  • About 0.000000000000003 atm (3 × 10⁻¹⁵ atm), or roughly 0.3 nanopascals of atmospheric pressure (a quick conversion is at the end of this comment).

    On Earth, this is considered to be a very good vacuum. In fact, the density of the atmosphere at the Moon’s surface is comparable to the density of some of the outermost fringes of Earth’s atmosphere, where the International Space Station orbits.

    The Moon is considered not to have an atmosphere because it cannot absorb measurable quantities of radiation, does not appear layered or self-circulating, and requires constant replenishment due to the high rate at which its gases are lost into space.

    I feel like saying the Moon technically has an atmosphere is like saying an astronaut who farted in space, sans spacesuit, has an atmosphere because some gas lingers around them.
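
    For reference, here’s the unit conversion behind that figure: a minimal sketch in Python, assuming the commonly cited lunar surface pressure of about 3 × 10⁻¹⁵ atm.

    ```python
    # Convert the Moon's surface pressure from atmospheres to pascals.
    # Assumes the commonly cited value of ~3e-15 atm at the lunar surface.
    ATM_TO_PA = 101_325  # pascals per standard atmosphere

    moon_pressure_atm = 3e-15
    moon_pressure_pa = moon_pressure_atm * ATM_TO_PA

    print(f"{moon_pressure_pa:.2e} Pa")         # ~3.04e-10 Pa
    print(f"{moon_pressure_pa * 1e9:.2f} nPa")  # ~0.30 nanopascals
    ```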



  • Did you want to add anything to the discussion or just make a snarky comment? I looked through the paper linked in the article and didn’t see a capacity listed.

    Our approach directs an alternative Li₂S deposition pathway to the commonly reported lateral growth and 3D thickening growth mode, ameliorating the electrode passivation. Therefore, a Li–S cell capable of charging/discharging at 5C (12 min) while maintaining excellent cycling stability (82% capacity retention) for 1000 cycles is demonstrated. Even under high S loading (8.3 mg cm⁻²) and low electrolyte/sulfur ratio (3.8 µL mg⁻¹), the sulfur cathode still delivers a high areal capacity of >7 mAh cm⁻² for 80 cycles.

    A 5C charging rate is great, but it’s pretty useless if the battery is too small to be practical.
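
    For a rough sense of scale from the quoted figures (my arithmetic, not from the paper): >7 mAh cm⁻² at 8.3 mg cm⁻² of sulfur works out to roughly 840 mAh per gram of sulfur, about half of sulfur’s theoretical ~1675 mAh g⁻¹.

    ```python
    # Back-of-the-envelope from the numbers quoted in the abstract.
    areal_capacity_mah_per_cm2 = 7.0      # ">7 mAh cm^-2"
    sulfur_loading_mg_per_cm2 = 8.3       # "8.3 mg cm^-2"
    sulfur_theoretical_mah_per_g = 1675   # theoretical specific capacity of sulfur

    specific_capacity = areal_capacity_mah_per_cm2 / sulfur_loading_mg_per_cm2 * 1000
    utilization = specific_capacity / sulfur_theoretical_mah_per_g

    print(f"~{specific_capacity:.0f} mAh per gram of sulfur")  # ~843 mAh/g
    print(f"~{utilization:.0%} of theoretical capacity")       # ~50%
    ```

    That’s a per-sulfur-mass number, not a cell-level one, so it still says nothing about total pack size, which was the original question.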





  • I think the Chinese room argument, published in 1980, gives a pretty convincing reason why the Turing test doesn’t demonstrate intelligence.

    The thought experiment places a computer that can converse fluently in Chinese in one room, and a human who knows only English in another, each behind a closed door. Chinese characters are written on a piece of paper and slipped under the computer’s door, and it slips back fluent replies. The human is then given English instructions that replicate the rules of the computer program for conversing in Chinese. By following those instructions, the human’s room can carry on the same flawless Chinese conversation, yet the human still does not actually understand the characters; they are merely following instructions. Searle states that the computer and the human are doing identical tasks, following instructions without truly understanding or “thinking”.

    Searle asserts that there is no essential difference between the roles of the computer and the human in the experiment. Each simply follows a program, step-by-step, producing behavior that makes them appear to understand. However, the human would not be able to understand the conversation. Therefore, he argues, it follows that the computer would not be able to understand the conversation either.









  • I don’t pretend to understand how homomorphic encryption works or how it fits into this system, but here’s something from the article.

    With some server optimization metadata and the help of Apple’s private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically-encrypted embedding from the device, and performs the aforementioned encrypted computations on that data to find a landmark match from a database and return the result to the client device without providing identifying information to Apple nor its OHTTP partner Cloudflare.

    There’s a more technical write-up here. It appears the final match happens on device, not on the server.

    The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate by using high-level multimodal feature descriptors, including visual similarity scores; locally stored geo-signals; popularity; and index coverage of landmarks (to debias candidate overweighting). When the model has identified the match, the photo’s local metadata is updated with the landmark label, and the user can easily find the photo when searching their device for the landmark’s name.
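
    To make that flow concrete, here’s a hypothetical sketch of the protocol’s shape as I read it. All names are invented, and the encryption step is omitted so the example runs (this toy server can see the query, whereas the real one only ever handles a homomorphically encrypted embedding). The point is the structure: candidates come back from the server, and the final reranking and metadata update happen on the device.

    ```python
    # Hypothetical sketch only: invented names, no real encryption.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        landmark: str
        similarity: float

    # --- "Server" side (operates on ciphertexts in the real system) ---
    LANDMARK_DB = {
        "Golden Gate Bridge": [0.9, 0.1, 0.3],
        "Eiffel Tower":       [0.2, 0.8, 0.5],
        "Sydney Opera House": [0.1, 0.4, 0.9],
    }

    def nearest_neighbor_candidates(query_embedding, top_k=2):
        """Return the top_k landmarks closest to the query by dot product."""
        scored = [
            (name, sum(q * v for q, v in zip(query_embedding, vec)))
            for name, vec in LANDMARK_DB.items()
        ]
        scored.sort(key=lambda s: s[1], reverse=True)
        return [Candidate(name, score) for name, score in scored[:top_k]]

    # --- "Client" side: embed locally, query, then rerank on device ---
    def rerank_on_device(candidates, local_geo_bias):
        """Stand-in for the on-device reranking model: combine the server's
        similarity score with locally stored signals (e.g. geo hints)."""
        return max(
            candidates,
            key=lambda c: c.similarity + local_geo_bias.get(c.landmark, 0.0),
        )

    photo_embedding = [0.85, 0.15, 0.35]          # computed on the device
    candidates = nearest_neighbor_candidates(photo_embedding)
    best = rerank_on_device(candidates, {"Golden Gate Bridge": 0.2})

    photo_metadata = {"landmark": best.landmark}  # label stays on the device
    print(photo_metadata)                         # {'landmark': 'Golden Gate Bridge'}
    ```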


  • It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.

    From the link:

    Put simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.

    If it all works as claimed, and there are no side-channels or other leaks, Apple can’t see what’s in your photos, neither the image data nor the looked-up label.
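
    For anyone wondering how a server can do anything useful with data it can’t read, here’s a tiny toy illustration of the underlying property. Textbook unpadded RSA happens to be multiplicatively homomorphic; this is deliberately insecure and is not the scheme Apple describes, but it shows “compute on encrypted data without decrypting it” in a few lines.

    ```python
    # Toy demo: the party holding only ciphertexts can still compute on them.
    # Unpadded textbook RSA with tiny primes; insecure, illustration only.
    p, q = 61, 53
    n = p * q   # 3233, public modulus
    e = 17      # public exponent
    d = 2753    # private exponent (e * d = 1 mod (p-1)(q-1))

    def encrypt(m: int) -> int:
        return pow(m, e, n)

    def decrypt(c: int) -> int:
        return pow(c, d, n)

    a, b = 12, 7
    ca, cb = encrypt(a), encrypt(b)

    # "Server" side: multiply the ciphertexts without ever learning a or b.
    c_product = (ca * cb) % n

    # "Client" side: decrypting the combined ciphertext yields a * b.
    assert decrypt(c_product) == a * b
    print(decrypt(c_product))  # 84
    ```

    Schemes actually used for this kind of private lookup support richer operations (sums and products over encrypted vectors), which is what makes encrypted similarity search against a landmark database possible.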