
  • I’m very much interested in what they intend to use to shoot the drones. Missiles? Way too expensive.

    Well, if we’re talking about a policing role, it may be fine.

    In war, if Country A and Country B are arm-wrestling, and Country A can launch a drone that costs a tenth of what Country B’s missiles do, you can probably guess that Country A is going to keep sending drones, because that’s a pretty favorable exchange. Gotta worry about what happens if it scales up.
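
    To make that exchange math concrete, here’s a rough sketch in Python – the specific prices are made up for illustration, and only the roughly 10:1 price ratio comes from the example above:

        # Back-of-the-envelope cost exchange. All figures are hypothetical;
        # only the ~10:1 drone-to-missile price ratio comes from the example above.
        drone_cost = 50_000          # attacker's drone (assumed)
        interceptor_cost = 500_000   # defender's missile (assumed, ~10x the drone)
        shots_per_kill = 1.5         # assume some interceptors miss

        defender_cost_per_drone = interceptor_cost * shots_per_kill
        ratio = defender_cost_per_drone / drone_cost
        print(f"Defender spends ~{ratio:.0f}x what the attacker does per drone downed")
        # ~15x here, which is why the attacker is happy to keep sending drones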

    But if we’re talking a policing role and don’t expect hundreds or thousands of drones to be sent out – like, the aim is countering espionage or sabotage – that might be okay.

    Now, granted, one possibility is that someone might try to figure out a way to send large numbers of drones to do the above, but then that starts to stand out. I think that the current situation is probably more of one where the concern is that malicious drone operators are trying to hide in the noise created by benign drone operators. We don’t easily know whether a given drone is just some random person flying a drone where it shouldn’t be, or whether it’s someone trying to gather intelligence. But if spies start launching a hundred drones at a go, it’s going to be pretty obvious that it’s not just some random person making a mistake.

    EDIT:

    Not sure the Bundeswehr got any, and if not, it’ll take five years of debate on whether this is technology we actually need and another ten to procure the necessary equipment.

    I remember just reading about some kind of programmable-airburst SPAAG that Germany’s sending Ukraine, think it was on a Boxer chassis. Assuming that Germany isn’t sending every one of those that they have, they probably have some to stick around sensitive areas of their own.

    kagis

    https://mil.in.ua/en/news/ukraine-is-likely-to-receive-boxer-infantry-fighting-vehicles/

    The Boxer RCT30 combat module combines the unmanned turret from KNDS Germany with the proven Boxer control module from ARTEC – a joint venture between Rheinmetall and KNDS Germany. The module is armed with the MK 30-2/ABM 30×173 mm stabilized automatic cannon from Rheinmetall. It provides accurate engagement of moving targets both on the ground and in motion.

    The German army intends to purchase about 150 systems of this type, and the Netherlands – 72 systems.

    The vehicle also has a landing compartment that can accommodate up to six fully equipped infantrymen. However, as the publication notes, the name “command support vehicle” may indicate that these combat vehicles will not be used as an infantry fighting vehicle, but can be used to protect the RCH 155 self-propelled howitzers from drones.

    https://www.rheinmetall.com/Rheinmetall Group/brochure-download/Weapon-Ammmunition/B305e0424-MK30-2-ABM-automatic-cannon.pdf

    Within a range up to 3,000 metres the MK30-2/ABM delivers maximum effectiveness against land-, air- and sea targets.

    So if you plonk one of those in the middle of a military base or whatever, you’ve got a defended bubble with something like a 3 km radius.
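
    For a rough sense of scale – taking that quoted 3,000 m figure as an effective engagement radius, which is my assumption, not something the brochure promises:

        import math

        # Ground footprint under a ~3 km engagement radius (my assumption above).
        radius_km = 3.0
        footprint_km2 = math.pi * radius_km ** 2
        print(f"~{footprint_km2:.0f} km^2 of ground under the defended bubble")  # ~28 km^2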

    looks further

    It also looks like there’s some fancier thing that has both a gun and missiles.

    https://en.wikipedia.org/wiki/Skyranger_30

    The Skyranger 30 is a short range air defense turret system developed by Rheinmetall Air Defence AG (formerly Oerlikon) and first revealed in March 2021. Its role is to provide ground units with a mobile system capable of engaging fixed and rotary-wing aircraft, Group I and II unmanned aerial systems (UAS), loitering munitions and cruise missiles.[1][2]

    Assuming that the “Group I” here is the same as the US classification scheme for UASes and Germany isn’t using some unrelated but similarly-named classification system, it’s intended for use against fairly small drones:

    https://en.wikipedia.org/wiki/Unmanned_aerial_vehicle#Terminology

    Group 1: Max take-off weight: < 20 lb (9.1 kg)

    Group 2: Max take-off weight: > 20 lb & < 55 lb (25 kg)






  • In August 1993, the project was canceled. A year of my work evaporated, my contract ended, and I was unemployed.

    I was frustrated by all the wasted effort, so I decided to uncancel my small part of the project. I had been paid to do a job, and I wanted to finish it. My electronic badge still opened Apple’s doors, so I just kept showing up.

    I asked my friend Greg Robbins to help me. His contract in another division at Apple had just ended, so he told his manager that he would start reporting to me. She didn’t ask who I was and let him keep his office and badge. In turn, I told people that I was reporting to him. Since that left no managers in the loop, we had no meetings and could be extremely productive.

    They created a pretty handy app that was bundled with the base OS, and which I remember having fun using. So it’s probably just as well that Apple didn’t hassle them. But in all seriousness, that’s not the most amazing building security ever.

    reads further

    Hah!

    We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.



  • It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

    And it’s not the battery itself because I’ve tried getting new batteries for it.

    At least some Dell laptops authenticate to the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

    Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.
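
    In rough pseudo-logic, the behavior I’ve seen boils down to something like this – a sketch of the observable behavior only, not Dell’s actual firmware, and the names here are made up:

        # Sketch of the observable charging behavior, not Dell's actual firmware.
        # "adapter_id" stands in for whatever handshake runs over the little
        # pin inside the barrel connector.
        def charging_policy(adapter_connected: bool, adapter_id: str | None) -> str:
            if not adapter_connected:
                return "run on battery"
            if adapter_id is None:  # bent pin, failed handshake, third-party charger
                return "run on AC, but do NOT charge the battery"
            return "run on AC and charge the battery"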

    I bet the charger on yours is a barrel charger with that pin down the middle.

    hits Amazon

    Yeah, looks like it.

    https://www.amazon.com/dp/B086VYSZVL?psc=1

    I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

    If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the replacement you get is an official one (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

    EDIT: Even one of the top reviews on that Amazon page mentions it:

    I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…


    Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software just ran twice as quickly with every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

    In that environment, it was quite important to upgrade the CPU.

    But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

    This piece is from 2012, so it’s over a decade old now:

    https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

    Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

    Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

    If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

    We can also look at the roughly twelve years since then, over which things have been even slower:

    https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

    This is using a benchmark to compare the single-threaded performance of the i7-4960X (Intel’s high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed single-threaded performance about (5068/2070) ≈ 2.448 times that of the 12-year-old processor. That’s (5068/2070)^(1/12) ≈ 1.0775, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
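
    If you want to check that annualized-rate arithmetic yourself (the 4.6x-over-8-years figure is from the preshing.com piece, the two benchmark scores are the ones above, and everything else is just compounding):

        # Annualized improvement implied by an overall performance ratio.
        def annualized(ratio: float, years: float) -> float:
            return ratio ** (1 / years) - 1

        print(f"{annualized(4.6, 8):.1%} per year")           # 2004-2012 era: ~21% per year
        print(f"{annualized(5068 / 2070, 12):.1%} per year")  # 2013-2025: ~7.7% per year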

    We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike serial compute, parallel compute isn’t a “free” performance improvement – software needs to be rewritten to take advantage of it, many problems are hard to parallelize, and some cannot be parallelized at all.

    Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.



  • Hmm. Fair enough.

    Looking at a couple of other sources, it also sounds like ADS-B data stopped being transmitted prior to the landing. So, alongside the data recorder possibly cutting out, that’s another data point arguing for some measure of electrical issues (which doesn’t necessarily mean that the electrical system itself was damaged, just that power wasn’t reaching part of the plane’s systems).

    https://www.flightradar24.com/blog/jeju-air-2216-muan/

    The last ADS-B message received from the aircraft occurred at 23:58:50 UTC with the aircraft located at 34.95966, 126.38426 at an altitude of 500 feet approaching Runway 1 at Muan.

    Based on visual evidence (see video below, viewer discretion advised) and the altitude and vertical rate data received by Flightradar24, we believe that the final ADS-B messages received represent preparation for a possible flypast of the airport. A flypast is often performed to visually confirm that the landing gear is either down or not prior to making a decision on next steps. The chart below shows the altitude and reported vertical rate of the aircraft from 2000 feet to the last signal received at 500 feet.

    Post-ADS-B data

    It appears that ADS-B data was either no longer sent by the aircraft or the aircraft was outside our coverage area after 23:58:50 UTC. Based on coverage of previous flights and of other aircraft on the ground at Muan before and after the accident flight, we believe the former explanation is more likely. There are multiple possible explanations for why an aircraft would stop sending ADS-B messages, including loss of electrical power to the transponder, a wider electrical failure, or pilot action on the flight deck.

    EDIT: I did also see a pilot talking about the video and pointing out that while the crew didn’t get flaps or gear, they managed to deploy at least one thrust reverser. I’m not sure what drives that (Do you need hydraulics? Electricity?), but it might say something about what was available to them.


  • Now it looks like some electrical systems, including power to the data recorders, died right at the start of the incident, which would require not just double engine failure but failure of the APU and backup battery systems. That just seems incredibly unlikely.

    Electrical failure doesn’t absolutely require that the engines fail. Suppose something shorts the electrical system?

    Like, United Airlines Flight 232 had an uncontained engine failure that then severed all three hydraulic systems. The real problem that the pilots faced wasn’t “we’ve lost one of our three engines”, but rather the secondary damage to other aircraft systems resulting from that failure.

    I assume that there’s some level of electrical system redundancy, but then, the same was true of the hydraulic systems on UA232 – it just required a really unlucky failure, with the engine shrapnel hitting multiple things, to cause the redundancy to also be wiped out.

    looks for WP page

    https://en.wikipedia.org/wiki/Jeju_Air_Flight_2216

    Authorities said that a bird strike may have caused a malfunction that affected the hydraulic system controlling the landing gear and that there was insufficient time for the pilots to manually deploy the landing gear.

    I don’t see how a hydraulic system failure alone would have caused the flight recorder to go offline.

    But it does kinda sound to me like maybe they’re talking about a bird strike causing some kind of secondary problems. Supposing a bird strike caused an uncontained engine failure – which has happened before – and that then caused secondary problems as bits of engine severed other things in the aircraft. What if those secondary problems were electrical in nature, rather than hydraulic?

    EDIT: The landing was also apparently done without use of flaps. Looking online, it sounds like the lack of landing gear and flaps suggests that hydraulics weren’t available. But I’d guess that a loss of electrical power to the hydraulic system, rather than the hydraulics themselves failing, could also explain such a situation.

    EDIT2: If there’s power loss, some aircraft have a ram air turbine that drops down to get a small amount of electrical power. I was thinking that that might have been usable as an indicator that electrical power was gone. In the video, I don’t see that, but it sounds like a 737-800 doesn’t have one. According to this, that aircraft is also apparently capable of being controlled to a limited degree even without electrical power due to mechanical connections:

    https://aviation.stackexchange.com/questions/42565/does-a-boeing-737-800-have-a-ram-air-turbine-rat

    • If all fuel is gone and the batteries are depleted, the aircraft can be flown by hand, directly overcoming the aeroforces by pulling hard! This is called manual reversion.

    • In manual reversion, the aileron trim tabs now function as geared tabs, assisting in overcoming the aeroforces. Elevators will have high aeroforces, high friction forces, and freeplay around centre point. Stabiliser trim wheels provide additional pitch control. The rudder has no manual reversion.

    That would be consistent with them being able to bring the plane down the way they did, even if they didn’t have electrical power.




  • I’ve kind of felt the same way – I’d rather have a somewhat-stronger focus on technology in this community.

    The current top few pages of posts are pretty much all just talking about drama at social media companies, which frankly isn’t really what I think of as technology.

    That being said, “technology” kind of runs the gamut in various news sources. I’ve often seen “technology news” basically amount to promoting new consumer gadgets, which isn’t exactly what I’d like to see from the thing either. I don’t really want to see leaked photos of whatever the latest Android tablet from Lenovo is.

    I’d be more interested in reading about technological advances and changes.

    I suppose that if someone wants to start a more-focused community, I’d also be willing to join that, give it a shot.

    EDIT: I’d note that the current content here kind of mirrors what’s on Reddit at /r/Technology, which is also basically drama at social media companies. I suppose that there’s probably interest from some in that. It’s just not really what I’m primarily looking for.



  • I think that California should take keeping itself competitive as a tech center more-seriously. I think that a lot of what has made California competitive for tech is that it had tech from earlier, and that past a certain threshold, it becomes advantageous to start more companies in an area – you have a pool of employees and investors and such. But what matters is having a sufficiently-large pool, and if you let that advantage erode far enough, your edge also goes away.

    We were just talking about high California electricity prices, for example. A number of datacenters have shifted out of California because the cost of electricity is a significant input. Now, okay – you don’t have to be right on top of your datacenters to be doing tech work. You can run a Silicon Valley-based company that has its hardware in Washington state, but it’s one more factor that makes it less appealing to be located in California.

    The electricity price issue came up a lot back when people were talking about Bitcoin mining more, since there weren’t a whole lot of inputs and it’s otherwise pretty location-agnostic.

    https://www.cnbc.com/2021/09/30/this-map-shows-the-best-us-states-to-mine-for-bitcoin.html

    In California and Connecticut, electricity costs 18 to 19 cents per kilowatt hour, more than double that in Texas, Wyoming, Washington, and Kentucky, according to the Global Energy Institute.

    (Prices are higher everywhere now, as this was before the COVID-19-era inflation, but California remains on the expensive end electricity-wise.)
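
    To put that price gap in perspective, here’s a rough sketch – the facility size is a made-up example, and only the per-kWh figures trace back to the article above:

        # Hypothetical 10 MW datacenter running around the clock. The load is
        # made up; the per-kWh prices are roughly the ones cited above.
        load_kw = 10_000
        hours_per_year = 24 * 365
        kwh_per_year = load_kw * hours_per_year

        price_ca = 0.19  # $/kWh, California (per the quoted article)
        price_tx = 0.09  # $/kWh, roughly half of that, per the same article

        extra_cost = kwh_per_year * (price_ca - price_tx)
        print(f"~${extra_cost / 1e6:.1f}M/year more to run that load in California")
        # ~$8.8M/year for this hypothetical facility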

    I think that there is a certain chunk of California that is kind of under the impression that the tech industry in California is a magic cash cow that is always going to be there, no matter what California does, and I think that that’s kind of a cavalier approach to take.

    EDIT: COVID-19-era remote working also did a lot to hurt California here, since a lot of people decided “if I don’t have to pay California cost-of-living and can still keep the same job, why should I pay those costs?” and just moved out of state. If you look at COVID-19-era population-change data for the counties around the San Francisco Bay Area, there was a pretty remarkable drop.

    https://www.apricitas.io/p/california-is-losing-tech-jobs

    California is Losing Tech Jobs

    The Golden State Used to Dominate Tech Employment—But Its Share of Total US Tech Jobs has Now Fallen to the Lowest Level in a Decade

    Nevertheless, many of the tech industry’s traditional hubs have indeed suffered significantly since the onset of the tech-cession—and nowhere more so than California. As the home of Silicon Valley, the state represented roughly 30% of total US tech sector output and got roughly 10% of its statewide GDP from the tech industry in 2021. However, the Golden State has been bleeding tech jobs over the last year and a half—since August 2022, California has lost 21k jobs in computer systems design & related, 15k in streaming & social networks, 11k in software publishing, and 7k in web search & related—while gaining less than 1k in computing infrastructure & data processing. Since the beginning of COVID, California has added a sum total of only 6k jobs in the tech industry—compared to roughly 570k across the rest of the United States.

    For California, the loss of tech jobs represents a major drag on the state’s economy, a driver of acute budgetary problems, and an upending of housing market dynamics—but most importantly, it represents a squandering of many of the opportunities the industry afforded the state throughout the 2010s.