This is the lord’s work. It’s ridiculous that in 2025 my $500 gaming PC GPU cannot tell the receiver to change inputs. Even my Apple TV, which is considered a model citizen here, steals the receiver’s input every few hours if I have another device active.
Yeah, the Apple TV isn't better so much as it is very aggressive. I usually have to long press the power button on the Apple TV remote to get it to power off and let go of my receiver.
Other devices like an Nvidia Shield or an Xbox require that you press power/home a couple of times to take control of the receiver and switch inputs.
The premise still strikes me as a ridiculous one: Am I possibly a more affluent customer because there is a high pile rug under the coffee table? How much would Charmin pay to know I have two rooms with tiled floors?
What iRobot actually suggested was more mundane: that there could hypothetically exist a protocol for smart devices to share a spatial understanding of the home, and that their existing robot was in a favorable position to provide the map. The CEO talking about it like a business opportunity rather than a feature invited the negative reception.
It didn't help that a few years later, photos collected by development units in paid testers' homes for ML training purposes were leaked by Scale AI annotators (akin to Mechanical Turk workers). This again became "Roomba is filming you in the bathroom" in the mind of the public.
The privacy risk seemed entirely hypothetical—there was no actual consumer harm, only vague speculation about what the harm could be, and to my knowledge the relevant features never even existed. And yet the fear of Alexa having a floorplan of your home could have been great enough to play a role in torpedoing the Amazon acquisition.
> The premise still strikes me as a ridiculous one: Am I possibly a more affluent customer because there is a high pile rug under the coffee table?
I've no idea about rug pile depth, but I'd have thought a simple link between square footage and location would be a reasonable proxy for affluence.
Not sure that works, though, for flogging, say, client-IP-to-affluence data to advertisers, unless they can already reliably pinpoint the client IP to an address (which, for all I know, maybe they can).
The roombas with cameras don't need an internet connection to work-- they need it if you want the app control features like scheduling. The imagery based navigation is still local.
When I got one in ~2019, I covered the camera and connected it long enough for it to get firmware updates (which, annoyingly, you can't trigger and which take a few days)... then I firewalled it off so it had no internet access.
I later figured out that if you let it connect and then firewall it off, it just sits in a tight loop trying to reconnect hundreds of times per second, which meaningfully depletes the battery faster.
Changing the SSID name so it couldn't connect to the wifi solved the problem.
I'd like to get a new one-- the old one still runs well (with some maintenance, of course) but the latest robot vacuums are obviously better. Unfortunately at least some are more cloud dependent and I can't tell which are and to what degree.
Would the US security leviathan give away other people’s money for highly current floor plans of every residence in the country just on the 1-in-a-million chance they decide to kick in your door and shoot your dog? Probably.
You’re looking at this from a standpoint where the only piece of information about you out there is the data collected by the Roomba. In reality, every sensible data broker would just add that signal to your already verbose profile and feed it to a model to determine the stuff you’re likely to buy… or that would trigger you to generate engagement, or whatever is needed.
The privacy danger here is not the one data point; it’s the unknown number of other parties who will mix and match it with more data.
With GDPR, I’ve been requesting copies of my telemetry from various corps, and it’s amazing the kind of stuff they collect. Did you know Kindle records every time you tap on the screen (even outside buttons), in addition to what you read and highlight and which pages you spend time on? Now add to that your smart TV’s insights about you and your robot vacuum cleaner… you see how this all grows out of control.
I don't get this. So you're saying that they can and do sell maps of your home to the highest bidder. But... it's actually overblown, even though they're doing exactly what people were concerned they were doing?
It's MY home! I don't want anybody filming it or recording it or selling maps of it. Full stop!
> [iRobot CEO] Angle said iRobot would not sharing data [sic] without its customers' permission, but he expressed confidence most would give their consent in order to access the smart home functions.
The "sharing data" here meant sharing data with other brands' smart home devices but appears misinterpreted as "sharing data with advertisers/data brokers/etc." Say Sonos wanted to make a hi-fi system that optimized audio to your room layout based on Roomba's map.
Upon careful re-reading of the article, I think what the CEO was saying was that they were pursuing becoming the spatial backend for Alexa / Google Home / HomeKit, but the journalist wrote Amazon / Google / Apple, which makes it seem more about advertising data collection than about smart home technology.
(Evidence that this is the correct interpretation: Facebook, despite being a giant data harvesting and advertising operation, was not listed as a potential partner, because they do not have a smart home platform.)
Aside from having parts available, I was unexpectedly impressed that my RoboRock self-emptying dock (c. 2023) was clearly designed for painless serviceability. The ducts are easily accessed via removable panels, and you need only a Phillips screwdriver.
That said, the performance of the robot certainly degraded over time, and I haven't identified the cause to my satisfaction. Obstacle avoidance needs work (especially for charging cables left dangling off the couch), and the map is frustrating to edit and seems to degenerate over a 6 month period.
I work in a department that has been using ServiceNow for at least 5 years, and I still do not know how to look up a ticket by ticket number. I just pretend I'm following along when my colleagues reference a ticket.
I just spent a minute poking at it: my dashboard page didn't load, then it told me there are no open tickets in the system, then clicking on a different ticket number to open it didn't do anything, and then the server stopped responding. (Edit: it took 48 seconds to load the ticket.)
They also have a little stopwatch button on some pages that pops up a "Browser Response Time" window that tries to put the blame for slow page load times on the user's browser. Weird, wonder why they need that...
Yes! It always amazes me that there seems to be no obvious URL scheme like servicenow.sadcompany.com/<ticket-number>. Like, did the developers forget to implement that?
Yeah, and there's no search field, either. Surely, this is my misunderstanding and I should click the "Show Help" icon for a product tutorial, right? This pops up a window saying:
> Now Assist offers real-time guidance and support for users seeking help with Virtual Agent. This feature’s generative AI skills blah blah blah
Ok...? There is no input box to interact with "Now Assist" or the "Virtual Agent"; it's just a marketing blurb for some other feature.
F500 here; we have a pretty custom ServiceNow, but all I do is put the ticket number or any other identifier in the search box and go. Takes 2 seconds to be in the ticket. Granted, that interface sucks too, but I suspect your main problem is internal to your org and the people who configured your ServiceNow.
Your system was configured by muppets if you don’t have a search box. It’s a massive beast that, like all enterprise-grade software, is a toolbox for you to bend to your will, but the downside is that if your configuration people don’t have empathy for the users (looking at you especially, contract architects), you end up with a system that is optimised for whoever talks with the vendor, and not for anyone else.
What? Unless someone actively removed the search field, you should have quite a big search field in the top right corner, where you can basically search for anything you'd need.
> my dashboard page didn't load, then it told me there are no open tickets in the system, then clicking on a different ticket number to open it didn't do anything, and then the server stopped responding.
Like all in-house SaaS implementations, this is entirely on your company's ServiceNow developers.
I've worked on multiple SNOW implementations and things can go really bad when you go crazy with the customizations.
Your comment makes me understand the product even less. So it’s SaaS where you have to develop it yourself? What exactly is the company providing? Why do its customers simultaneously want to outsource this to a vendor and then spend resources customizing it down to the level of “basic CRUD operations work” and “the user sees a search field”?
ServiceNow is a platform-as-a-service (PaaS), not SaaS: it allows development of new products on top of it.
At its core there is a workflow management engine that third parties can use to implement their own stateful, process-centric products and services.
We have ServiceNow proper (the CRM) and a third-party product, completely unrelated to CRM, that we purchased and which is implemented on the ServiceNow platform. The two have nothing to do with each other and are used by different business users.
You don't develop it, you develop on it. SN provides the underlying software, implementations, hosting, upgrades, etc. Salesforce is another example of this.
Not to say that ServiceNow is great, but not being able to type the number into the search bar (top right) sounds more like a user issue than anything else.
A guest logged into Wi-Fi on a Vizio of mine and there was conveniently no way to disconnect/forget it without a factory reset back to motion smoothing hell.
Change your network name. When the TV prompts you to connect, join the renamed network. Then rename it back so everything else can connect again and the TV can't. I can think of a few potential problems with this, but it might work?
Or blacklist the TV's MAC address in your router settings. Didn't think of that first for some reason.
You gave me flashbacks to my Samsung washing machine that needed a factory reset after I changed my SSID. That also reset the service life of filters and liquids and such, which was somewhat of a hassle. Such a dumb design, not being able to change the wireless network.
Does the search feature work for you? Mine gives me about 2 seconds to enter a search term, but once it has fetched results for the partial input, it keeps reverting the search field as I try to enter more, even if I delete to try again.
I would not be surprised if there is no QA team for the tvOS app.
Awesome. I refer to https://bourdain.greg.technology/#food-im-thinking-about about once a year. One of my favorite vacations was going to a different hawker stall on his list each night in Singapore. Unsurprisingly, his picks are all pretty good, and #1 is justified in crowning the list.
Also, for general Bourdain tourism: Eat Like Bourdain is a really passionate and fleshed-out blog that tells you where and what he ate in each city/country.
I use it pretty frequently.
Chicken and rice is anything but bland. I haven't had Hainanese style but the Thai style khao man gai that Nong's serves in Portland is a flavor that I still remember more than a decade later.
chicken and rice has oil and some savoriness but it's not jacked to the tits with spice like an indian curry or any thai food - in that regard, compared to other asian cuisines, yes it is bland. compared to midwest mac & cheese, sure, maybe it's less bland but even then I bet a midwesterner could pleasantly eat the dish where they would be on the struggle bus eating indian food
The chicken is indeed bland, although the non-canonical roasted version is more flavorful than the traditional poached one. The rice, which is cooked in chicken stock and spices, is anything but, but it's the fresh chili sauce that really makes it zing, in the same way that wasabi makes "bland" sushi work.
Tian Tian is overrated and not worth the lines though. Every Singaporean has their favorite but I like Loy Kee, partly thanks to their amazing slogan, "Chicken Lickin' Good".
I made a brief attempt at splitting each character into a separate <span style="transform: scale(<random>, <random>)">c</span>, but it doesn't look good because the transform is applied after the glyph is rasterized. I didn't see a way to scale the font size itself along two different axes, and applying a single scaling factor of 97-100% does not perfectly recreate the effect. text-rendering: geometricPrecision probably helps.
I'm not a frontend developer; I knew about ::before and ::after, but just learned about the adjacent sibling combinator +, the general sibling combinator ~, and :has() after reading your comment. Maybe every character in the text could be wrapped in a <span> via Javascript, where the class name encodes the character's Unicode value (in hex, with a letter prefix, since CSS class selectors can't start with a digit). Then CSS could tighten the spacing and simulate kerning for certain character combinations (a rough sketch of the wrapping step follows the CSS below):
text:
it
html:
<span class="c69">i</span><span class="c74 sarcastic">t</span>
css:
/* could also use ch or ex instead of em */
.c69 + .c74 {
  margin-left: -0.1em; /* negative margin pulls the t toward the i */
}
.sarcastic {
  display: inline-block; /* transforms are ignored on plain inline boxes */
  transform: skewX(-10deg);
}
/* loosen spacing a bit for certain randomness (more specific, so it wins) */
.c69 + .c74.sarcastic {
  margin-left: -0.05em;
}
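For what it's worth, a rough sketch of the wrapping step in plain Javascript (the function name, the selector, and the 30% chance of tagging a glyph .sarcastic are all made up for illustration):

// Wrap every character of an element's text in a <span class="cXX"> so the
// CSS above can target specific character pairs.
function wrapChars(el) {
  const text = el.textContent;
  el.textContent = "";
  for (const ch of text) {
    const span = document.createElement("span");
    // "c" prefix because CSS class selectors can't start with a digit
    span.className = "c" + ch.codePointAt(0).toString(16);
    // sprinkle in a style class now and then, but never on whitespace
    if (Math.random() < 0.3 && /\S/.test(ch)) span.classList.add("sarcastic");
    span.textContent = ch;
    el.appendChild(span);
  }
}
wrapChars(document.querySelector("p"));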
Maybe the type of randomness applied could be set as additional classes on the character, limited only by imagination (I added .sarcastic as an example). Maybe AI could be trained on sample text to tidy up the kerning for a large number of permutations, although the generated css could get quite large.
I asked AI if there's a way to apply css to specific characters instead of selectors, but unfortunately that doesn't seem to be possible (yet). It feels strange to live in a world where I could have just asked AI to do all of this for me in an online sandbox in less time than it took me to write this comment :-/
The technique applied is not randomly selecting a different typeface per paragraph, but tweaking the glyph shapes when a character is repeated. Glancing at the LibreOffice extension, it seems to slightly vary CharScaleWidth between 90% and 110% and CharEscapementHeight between 97% and 100% of the base height.
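A rough browser-side analogue of that idea (not what the extension does, and just one reading of "vary repeated characters": leave the first occurrence alone and nudge later ones; the rasterization blur caveat from the parent comment still applies):

// For each character that has appeared before, apply a slight random
// width/height scale so repeated glyphs don't look identical.
// Ranges mirror the ones mentioned above: 90-110% width, 97-100% height.
function varyRepeats(el) {
  const seen = new Set();
  const text = el.textContent;
  el.textContent = "";
  for (const ch of text) {
    const span = document.createElement("span");
    span.textContent = ch;
    if (seen.has(ch) && /\S/.test(ch)) {
      const sx = 0.9 + Math.random() * 0.2;   // 90-110% width
      const sy = 0.97 + Math.random() * 0.03; // 97-100% height
      span.style.display = "inline-block";    // transform needs a non-inline box
      span.style.transform = `scale(${sx}, ${sy})`;
    }
    seen.add(ch);
    el.appendChild(span);
  }
}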