I have been using Inoreader as my main feed reader for quite a while now, after having tried Feedly for a few years. I even upgraded to their Premium plan for the power user features and the higher newsletter limits.
I subscribe to a few newsletters this way as it is easier to get all the content in one place. There has been one rendering issue I have been facing with Matt Levine’s Money Stuff newsletter ever since a recent Inoreader update: the last few characters in each line would get cut off in the pop-up reader view, like so:
I tried reaching out to support, but they were not able to do much. So I did a bit of research and found that Inoreader has a custom CSS feature in its power user settings that some folks have used to personalise the interface. The newsletter content was being rendered in an HTML table, which I discovered by inspecting the source (hit F12 in the browser or open the dev tools).
I did a bit of experimentation in the custom CSS settings, and found that setting the table width to 85% fixed the issue:
/* Constrain the table width so the newsletter text is no longer clipped in the pop-up reader view */
table {
  width: 85%;
}
I’m sure this is a fairly obscure issue that only affects users like me who read a particular newsletter through a feed reader, but I’m documenting it in case others face something similar.
You could of course just read the newsletter in your email inbox or through a service like NewsletterHunt, or simply switch to the full article view in Inoreader.
I previously wrote about the second device that I got after coming to Dubai, but not much about the first one, which was a gaming laptop. So here’s a bit about the laptop, which also doubles as a local AI driver thanks to its Nvidia GPU (an RTX 3060).
Soon after getting it back in 2022, I tried running the Stable Diffusion models, and it was quite an upgrade over my original attempt on a plain GPU-less Windows machine. Generation times came down to around 10 seconds, and have gotten even faster as the models and tools have been optimised over the last couple of years. There are quite a few projects available on GitHub if you want to give it a try – AUTOMATIC1111 and easydiffusion are among the more popular options. Nvidia also has a TensorRT extension to further improve performance.
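If you’d rather drive the model from a script than through one of those web UIs, here’s a minimal sketch using Hugging Face’s diffusers library. This is just an alternative route, not what the tools above do under the hood for you, and the model ID and prompt are illustrative placeholders:

# Minimal text-to-image sketch with Hugging Face diffusers
# (model ID and prompt are illustrative; use whichever checkpoint you have downloaded)
from diffusers import StableDiffusionPipeline
import torch

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision keeps VRAM usage low enough for a laptop GPU
)
pipe = pipe.to("cuda")  # run on the Nvidia GPU; drop this line to run (much more slowly) on the CPU

image = pipe("a cozy reading nook, soft lighting").images[0]
image.save("output.png")

The half-precision weights should fit comfortably in the RTX 3060’s VRAM, which is why float16 is used above.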
With that out of the way, I also discovered LM Studio, which allows you to run LLMs locally with a chat-like interface thrown in, and gives you access to a bunch of models like Meta’s Llama. The response times are of course not as fast as the freely available online options like ChatGPT, Claude, Gemini and the like, but you effectively get unlimited access to the model.
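Beyond the chat interface, LM Studio can also expose a local server with an OpenAI-compatible API, so you can call the locally loaded model from your own scripts. Here’s a rough sketch with the openai Python client – the port is LM Studio’s usual default and the model name is a placeholder, so check what your local server screen actually shows:

# Chat with a model loaded in LM Studio via its local OpenAI-compatible server
# (port 1234 is the usual default; the model name below is a placeholder)
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # any non-empty key works locally

response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # use whichever model you have loaded
    messages=[{"role": "user", "content": "Explain the coffee meme from the Ace Attorney games."}],
)
print(response.choices[0].message.content)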
Here’s an example from a conversation I had with Llama regarding the coffee meme from the Ace Attorney game series:
The second device that I purchased in Dubai after relocating in 2022 was the Meta Quest 2 VR headset (the first was of course the gaming laptop that has been doing double duty as a GenAI device).
Picking it up towards the end of the year has its advantages, as the apps and games are usually discounted in the Christmas sales. In fact, I got Beat Saber as a freebie with my purchase. This was the game that sent me down the Meta VR App Store rabbit hole, where I found a bunch of sports games like:
There are also games for boxing, fishing, shooting and Star Wars (becoming Darth Vader’s apprentice), among others. They are a big departure from typical computer, mobile or console gaming, as they require you to move around and give you a decent workout.
I also picked up some accessories like a hard case to store & transport the device more safely, along with a head strap replacement. The head strap in particular is a big upgrade and almost a necessity if you want to use the headset for even a moderate amount of time.
Most of these apps have been around for several years now and have gotten a boost in features & quality thanks to the renewed focus on AR & VR with the launch of the Apple Vision Pro and the Meta Quest 3 over the last year or so.
Here’s my experience with some of these apps that have helped me stay more active, especially during the Dubai summers when it gets pretty difficult to do outdoor activities. One thing to note is that most of these apps/games require some dedicated space – typically 6′ x 6′ – to play safely, though some can be played standing in one place.
iB Cricket
This game has been developed by a team from India, and you can see that they have done their share of partnerships with some of the mainstream cricket events over the years. It is mainly a batting simulator where you can play as a bunch of teams at varying difficulty levels, and it also has multiplayer options & leagues if you like to compete against other players.
They sell a bat accessory that can be used with the Quest 2 controller to give you an easier and more authentic experience. This was in fact something that I picked up during one of my India trips and it really makes the gameplay much better.
VZFit
This year, I also picked up a subscription to the VZFit app, which can be used with an indoor bike to stay fit. By default they have a fitness experience that you can do using just the controllers, but the virtual biking is what piqued my interest. The app allows you to bike around different locations from Google Maps using the Street View imagery in an immersive form.
Here’s a sample from one of my rides along the Colorado river:
There are a bunch of user-curated locations that can be quite scenic. Some even come with a voiceover to direct your attention to places of interest. They also have regular challenges and leaderboards if you like to compete, and integration with a bunch of online radio stations to keep you entertained. You can also have a trainer accompany you on a bike and guide you through the workout.
You mainly need to connect a compatible Bluetooth cadence sensor to your Quest headset so that it can detect the bike activity. As for the stationary bike, you can get your own or use one at the gym. I got the Joroto X2 spin bike, which seems to be pretty good value. A battery-powered clip-on fan can also be pretty handy to keep you cool and simulate a breeze while you are virtually biking.
Beat Saber
Beat Saber is possibly one of the most well-known VR games. After all, it’s not every day that you get to dual-wield something akin to light sabers and chop things up with a soundtrack to match.
It is basically a virtual rhythm game that has been around for several years, where you wield a pair of glowing sabers to cut through approaching blocks in sync with a song’s beats and notes. It can give you a really good workout as it also involves ducking and dodging in addition to the hand movements.
Eleven Table Tennis
Given the size of the Quest controllers and their in-hand feel, which is similar to a TT bat, table tennis feels like a natural fit. This was one of the first sports games that I picked up on the Quest, and I saw the game evolve within just a few months of my purchase. It currently has a host of options ranging from practice to multiplayer at different levels of difficulty.
The multiplayer part is also pretty interesting and immersive as it can use your Meta avatar for the in-game player. It also has voice chat so you can talk to your opponent. The in-game physics is so realistic that you sometimes forget there is no actual table in front of you.
Vader Immortal Series
This is a 3-episode game on the Quest, and it doesn’t actually need you to move around as much as the other sports games that I have mentioned. However, if you are a Star Wars fan, this is pretty much a must-try game as it gives you your fill of light saber fighting sequences, starting with training against those mini floating droids and leading up to enemy fights standing beside Darth Vader.
If you loved the Jedi Knight series on the computer or one of the recent Star Wars games involving Jedi, then this is pretty much a no-brainer to try out. Oh, and you do get to use the Force push/pull powers as well.
I have been sharing some of the interesting reads that I come across on this blog/newsletter for a while now. Given the pace at which AI related news has been rolling out, I am consolidating the links into a series of monthly posts to reduce the load on your inbox/feed.
Here are the interesting developments in the world of AI from the last month and a half or so:
Agentic AI
When you give Claude a mouse: LLMs are gradually getting more access to actually do things on your computer, effectively becoming agents. Ethan Mollick shares his experience with Claude’s new feature, and its current strengths and weaknesses:
On the powerful side, Claude was able to handle a real-world example of a game in the wild, develop a long-term strategy, and execute on it. It was flexible in the face of most errors, and persistent. It did clever things like A/B testing. And most importantly, it just did the work, operating for nearly an hour without interruption.
On the weak side, you can see the fragility of current agents. LLMs can end up chasing their own tail or being stubborn, and you could see both at work. Even more importantly, while the AI was quite robust to many forms of error, it just took one (getting pricing wrong) to send it down a path that made it waste considerable time.
Claude gets bored: With great power comes great boredom, it seems. We are already witnessing some unintended behaviour from AI agents, with them getting distracted just like humans or taking unwanted actions:
Even while recording these demos, we encountered some amusing moments. In one, Claude accidentally stopped a long-running screen recording, causing all footage to be lost.
Later, Claude took a break from our coding demo and began to peruse photos of Yellowstone National Park. pic.twitter.com/r6Lrx6XPxZ
We find that having access to Copilot induces such individuals to shift task allocation towards their core work of coding activities and away from non-core project management activities. We identify two underlying mechanisms driving this shift – an increase in autonomous rather than collaborative work, and an increase in exploration activities rather than exploitation. The main effects are greater for individuals with relatively lower ability. Overall, our estimates point towards a large potential for AI to transform work processes and to potentially flatten organizational hierarchies in the knowledge economy.
Metaphysic developed the facial modification system by training custom machine-learning models on frames of Hanks’ and Wright’s previous films. This included a large dataset of facial movements, skin textures, and appearances under varied lighting conditions and camera angles. The resulting models can generate instant face transformations without the months of manual post-production work traditional CGI requires.
SEO may soon be passé with chatbots taking over from search engines. So, what’s next? Possibly something along the lines of Citate, which helps you analyse and optimise what is being served up on these chatbots.
Can we manipulate AI as much as it manipulates us? – With every new development in the way humans manage and share knowledge come tools to manipulate said knowledge. Fred Vogelstein takes a deeper look at the emerging options, including Citate and Profound.
UK-based mobile operator Virgin Media O2 has created an AI-generated “scambaiter” tool to stall scammers. The AI tool, called Daisy, mimics the voice of an elderly woman and performs one simple task: talk to fraudsters and “waste as much of their time as possible.”
…
Multiple AI models were used to create Daisy, which was trained with the help of YouTuber and scam baiter Jim Browning. The tool now transcribes the caller’s voice to text and generates appropriate responses using a large language model. All of this takes place without input from an operator. At times, Daisy keeps fraudsters on the line for up to 40 minutes, O2 says.
I have already been doing a simpler version of this using Samsung’s AI-based call screening, with most scammers hanging up pretty quickly. I’m sure this will get enhanced in the future.
It’s not just scammers misusing AI, unfortunately, and this bit of news about students creating deepfakes of classmates at a US school doesn’t help allay the fears of parents like me. Food for thought for the regulators, and also for authorities who need to take prompt action when such incidents occur:
Head of School Matt Micciche seemingly first learned of the problem in November 2023, when a student anonymously reported the explicit deepfakes through a school portal run by the state attorney’s general office called “Safe2Say Something.” But Micciche allegedly did nothing, allowing more students to be targeted for months until police were tipped off in mid-2024.
Cops arrested the student accused of creating the harmful content in August. The student’s phone was seized as cops investigated the origins of the AI-generated images. But that arrest was not enough justice for parents who were shocked by the school’s failure to uphold mandatory reporting responsibilities following any suspicion of child abuse. They filed a court summons threatening to sue last week unless the school leaders responsible for the mishandled response resigned within 48 hours.
When I moved to Dubai and got the apartment, the family was still back in Mumbai due to my daughter’s ongoing school session. This meant that I was on my own when it came to food (getting a cook for just one person did not make sense).
What is Sous Vide?
Being an engineer (with a management degree, like a lot of fellow Indians), I wanted to use a predictable and low-effort cooking method, and that’s how I got to know about the sous vide method. It roughly translates to ‘under vacuum’, and is basically a low-temperature, long-time technique where you vacuum seal your food and cook it in a temperature-controlled water bath. From Wikipedia:
Sous vide (/suː ˈviːd/; French for ‘under vacuum’), also known as low-temperature, long-time (LTLT) cooking, is a method of cooking invented by the French chef Georges Pralus in 1974, in which food is placed in a plastic pouch or a glass jar and cooked in a water bath for longer than usual cooking times (usually one to seven hours, and more than three days in some cases) at a precisely regulated temperature.
The temperature is much lower than usually used for cooking, typically around 55 to 60 °C (130 to 140 °F) for red meat, 66 to 71 °C (150 to 160 °F) for poultry, and higher for vegetables. The intent is to cook the item evenly, ensuring that the inside is properly cooked without overcooking the outside, and to retain moisture.
Here’s the video from Sorted Food that inspired me to try out sous vide cooking:
Why Sous Vide?
The main reason to go for sous vide cooking is that you just need to set the cooking temperature and not worry much about the cooking time, which is quite forgiving. Moreover, you don’t need to be actively involved while the actual cooking is happening, and can easily catch a few TV show episodes or part of a movie while the food cooks. The main active time is the ingredient prep, which can be done while getting the water bath to the required temperature.
One thing to keep in mind, especially with meats and fish, is that while the texture and taste of the food come out excellent, it may not be as appetizing to look at due to the lack of caramelization or any kind of crust, which comes from high-temperature cooking like frying or grilling. You can overcome this by finishing the food in a pan to get a crust, but keep it brief, as you may otherwise end up overcooking it and defeating the point of sous vide.
How to get started?
Immersion circulators are typically used for sous vide cooking, but I did not want to go for a single-purpose device. That’s how I discovered the multi-function Instant Pot pressure cooker with a sous vide feature on Amazon and promptly ordered one. I initially got a bunch of ziplock bags and used the displacement method for sealing, but it was not very secure. I subsequently opted for a vacuum sealer (an Inkbird model) that has turned out to be quite handy and reliable.
There are a bunch of sites with sous vide recipes, of which I found Serious Eats to be quite useful, as it gives a very detailed explanation of the effect that varying temperatures and cooking times have on the end result. I have tried cooking a variety of dishes, from prawns, mashed potatoes, fish (salmon & hilsa), chicken and lamb chops to panna cotta, and they have come out quite well. This has been endorsed by my better half and daughter, as well as the guests to whom we served some of the dishes.
Here’s a quick reference table for some of the items that I have tried:
Food                    | Temperature                                          | Time
Mashed Potatoes         | 90°C                                                 | 60-90 min
Salmon or Hilsa (Ilish) | 43°C (buttery texture) up to 54°C (flakier texture)  | 30-45 min
Chicken breast          | 60-65°C                                              | 1-4 hours
Chicken thigh           | 66-74°C                                              | 1-4 hours
Prawns                  | 60°C (poached texture)                               | 30-45 min
Lamb                    | 55-64°C                                              | 2-4 hours
Panna cotta             | 90°C                                                 | 60-90 min
Bonus: Bhapa Ilish alternative via sous vide
After trying the salmon sous vide, I wanted to give the Bengali favourite hilsa fish (a distant cousin of the salmon after all) a try. Bhapa ilish (steamed hilsa) is a fairly simple dish where you mainly need to season the fish cuts with salt, turmeric, mustard oil & mustard powder (or crushed mustard).
The typical technique, involving a steam bath in a pressure cooker or microwave oven, can be a bit hit or miss as the fish texture is very sensitive to temperature. That’s where sous vide comes in, and while I didn’t find any online recipes for sous vide ilish, the salmon settings worked out excellently.
Anyway, that’s my journey with the sous vide cooking method. Here are some of the photos of the dishes I cooked over the last couple of years:
The blog turns 19 next year, and this seemed like a good time to give it an updated theme with a more info-dense home page. I ended up selecting the Twenty Twenty-Five theme by WordPress, which has a pretty clean look and follows the current design trends.
I also upgraded to the Premium plan, which offers a Google Analytics connection and an ad-free browsing experience for visitors. I followed it up with some back-end admin work to fix the search engine indexing and to add domain verification for Facebook, Pinterest and Bing to go with the Google verification.
As I shared earlier, I will be posting more frequently on the site and also experimenting with some additional formats like streaming, podcasts and video. If you have any suggestions for topics for me to cover or want to collab, do drop me a note or leave a comment below.
We study how humans form expectations about the performance of artificial intelligence (AI) and consequences for AI adoption. Our main hypothesis is that people project human-relevant task features onto AI. People then over-infer from AI failures on human-easy tasks, and from AI successes on human-difficult tasks. Lab experiments provide strong evidence for projection of human difficulty onto AI, predictably distorting subjects’ expectations. Resulting adoption can be sub-optimal, as failing human-easy tasks need not imply poor overall performance in the case of AI. A field experiment with an AI giving parenting advice shows evidence for projection of human textual similarity. Users strongly infer from answers that are equally uninformative but less humanly-similar to expected answers, significantly reducing trust and engagement. Results suggest AI “anthropomorphism” can backfire by increasing projection and de-aligning human expectations and AI performance.
And a simplified explanation by Copilot (seemed apt to use in this case given the topic):
The paper explores how people form expectations about AI performance and how this impacts their willingness to use AI. The researchers’ main idea is that people tend to think of AI as if it should perform tasks in the same way humans do. This leads to two key behaviors:
Overestimating AI failures: When AI makes mistakes on tasks that are easy for humans, people think the AI is not very capable overall.
Overestimating AI successes: When AI does well on tasks that are hard for humans, people think the AI is more capable than it actually is.
Experiments show that these assumptions distort people’s expectations of AI. For example, if an AI struggles with simple tasks, people might avoid using it, even if it’s actually quite effective at other things. On the flip side, if it excels at complex tasks, people might over-trust it.
The researchers conducted a real-world experiment with an AI that provides parenting advice. They found that users were less trusting of the AI if its answers didn’t resemble what a human would say, even if the information was the same. This shows that making AI seem human-like (anthropomorphism) can sometimes backfire, leading to misaligned expectations between what AI can do and what people expect from it.
In essence, the study highlights that our human biases can lead us to misunderstand AI capabilities, which can affect how we adopt and use AI technologies.
Is this a reflection of the AI capabilities or our tastes?
We found that AI-generated poems were rated more favorably in qualities such as rhythm and beauty, and that this contributed to their mistaken identification as human-authored. Our findings suggest that participants employed shared yet flawed heuristics to differentiate AI from human poetry: the simplicity of AI-generated poems may be easier for non-experts to understand, leading them to prefer AI-generated poetry and misinterpret the complexity of human poems as incoherence generated by AI.
It’s been exactly 2 years since I joined my regional role at Boehringer Ingelheim in Dubai. I rarely blog about my personal life, but I thought now would be a good time to share some of the experiences around this move.
The move to Dubai was in itself fairly straightforward as it was an internal transfer. I started off with a regular employment-based residence visa valid for 2 years. My family also relocated in the middle of last year, and their visas were processed through the office pretty quickly.
Since my visa was due for renewal this year, I decided to opt for the UAE Golden Visa for salaried professionals, which has relatively easier qualification criteria than the other routes:
Monthly gross salary of AED 30,000 or higher (that seems to be the current consensus, as I have also read of it being the basic salary without allowances in the past)
Bachelor’s degree or higher
Getting the equivalency certificate for this is typically the most time consuming process
While my application was managed by my office in DIFC, which definitely helped with clarity around the process, I did find this recent Reddit post by the Amer Centre quite helpful, along with this article in Khaleej Times that explains the process and documentation requirements. Below is a simple guide to getting the necessary documents ready, based on my experience.
Step by step guide
The overall process took about 2 months for me: the first 3 weeks went into getting the degree equivalency certificate, followed by 2 weeks for the physical attestation of the degree and about 2 weeks for the actual visa application, health checkup & Emirates ID issuance.
You need digitized versions of the following key documents for the application (some, like the equivalency certificate, require additional documents for verification), in addition to other documents from your employer:
Degree equivalency certificate
Bank statement showing the salary credit each month
Attested degree certificate (UAE Embassy in university country and MOFA in UAE)
NOC from company
Current passport and visa
Recent passport-sized photo (white background, no glasses – you can tell the photo studio it is for the Emirates ID or visa)
The equivalency certificate
Getting the degree equivalency certificate is usually the bottleneck in this process, based on the experience of my colleagues and those who have shared their experience online.
The process is as below with details on the Ministry site here (they also have a useful document checklist that you can refer to):
Typically you would need your original degree certificate, the final transcript (the official stamped marksheet for the entire duration of the course) and a copy of your passport. You need to choose one of the official partners (Dataflow or Quadrabay at the moment) for the verification and share these documents with them. This part of the process costs around AED 350.
The turnaround time is slated to be 30 days, but it is completely dependent on the response time of the university. Here are a couple of tips that worked for me to help speed up the process:
Keep the details of your university alumni association and key academic departments handy.
Once the initial documents have been verified by the partner and sent to the university, check in with customer support if you do not get any update within a couple of weeks, and ask for details of their communication with the university.
This is how I managed to get the email subject line and the department to which they had mailed.
Contact the alumni association or academic department with the details you got regarding the verification communication to nudge it along.
Once the verification process is successfully completed, you will get the notification to complete the application on the Ministry site with the appropriate link. There is another payment involved, and the certificate is generated almost immediately. This completes the most time consuming part of the application.
Degree attestation and next steps
The next few steps are quite straightforward, and you could even get the degree attestation done while you are waiting for the verification to happen. You will of course need the physical degree certificate for this, and can use an agency like VFS (they have an attestation helpline that you can mail here) to get it done in 2-3 weeks with doorstep pickup and drop-off.
Once you have these documents you can go ahead with the actual visa application. A few additional tips:
In the bank statement (an online statement download should be fine), highlight the salary deposits, and make sure your name & account details appear on every page and highlight those as well.
If you are immediately transferring your salary to another account after the deposit, you may need to provide the statement from the other account as well.
The photo you submit will be used in the visa and Emirates ID, so you can ask the photo studio to take it accordingly.
You will probably be given a slot for the health checkup, but depending on your location you may be able to walk in for the checkup much earlier.
Ensure that you are set up on UAE Pass so that authentication on the partner sites is easier.
Set up your ICP app as well so that you can access the digital versions of your visa and updated Emirates ID. This also uses UAE Pass for login.
Depending on how you have applied, you may need to get the new Emirates ID re-issued.
You will need to transfer your dependents’ visas at some point in time as well.
Hope this helped you, and I wish you the best with your Golden Visa application! If this gets a good response, I’ll share some of my experiences and learnings around the Dubai relocation.
This time I’m looking to mix things up a bit with some posts drawing on my experience over the last few years in the field of digital transformation, to go with the usual eclectic musings and a dash of cooking thrown in for good measure.
Do subscribe via email or the feed to stay in touch.