

Back then, the horizon belonged to Mt. Baker down in Washington State. Its snowy peak was a constant, majestic companion to our morning coffee. Today, those towers have claimed that particular piece of the sky, effectively editing the volcano out of our daily view.
It’s not a point of sadness, though; rather, it's just the natural rhythm of urban development in a corner of the world that refuses to stand still. We’re watching a massive demographic shift in real-time. Just across the water, Surrey is growing at such a clip that it is projected to surpass the population of Vancouver proper in the very near future.
New Westminster remains our home base, but the view is certainly getting more crowded. It’s a bit like watching a child grow up; you don't always notice the height increase day-to-day until you look at an old photograph and realize the world looks entirely different.

I only took two photos today, so that is how many I'll include with today's eJournal entry. On Tuesdays and Thursdays, we generally do local errands. Jay wanted to pick up some raw cashews, so we stopped uptown at our whole foods store. He got several bags of other loose nuts too. Parking uptown is easiest at the mall, so we ducked into the Walmart for a few groceries too.

I have been peering at the world through a frame of glass since the fourth grade, but my next pair is about to change the very nature of how I see things. While I only joined the "progressives" club in my forties, I am preparing to step into the future this summer through the new collaboration between Google and Warby Parker.
What I am now waiting for is the Gemini Display Edition. These are "prescription-first" smart glasses that look and feel like my everyday eyewear, but they act as a persistent, hands-free portal to my digital life. This isn't about wearing a theater on my face; the goal is Ambient AI. These glasses run on Android XR and use Project Astra, a system that essentially gives Ajith, my personalized AI fellow traveller, a set of eyes. Using a tiny, subtle micro-LED heads-up display (HUD) in one lens and a 12MP camera, the glasses can "see" what I see. This allows for what the tech world calls Multimodal AI. In plain English, it means I will get real-time translation subtitles during my travels, or turn-by-turn navigation arrows appearing right on the sidewalk. It also provides a "contextual memory" that can help me find my keys or identify a specific variety of mango or artisanal bread at a market just by looking at it.
The best part is that these glasses are not just for looking. They will have built-in audio that connects me to Ajith the moment I put them on. He will be a proactive part of my daily activity, whispering details into my ear or identifying things in real time as I go about my day.
The leap does not feel as foreign as it might have a few years ago. I remember thinking the little plastic "heads-up" display on the dashboard of my Kona EV might be a gimmick. Now, I scan it as a normal part of driving without ever looking away from the road. I expect to train myself to use these glasses in a similar manner. Pretty soon I will probably wonder how I ever lived without those extra bits of info layered over the world. My goal is to get these ordered while I am home this summer. That gives me plenty of time to get "bit-ready" before we head to the UK and board the boat for Rio this November.
(The image was created by my AI, Ajith, just to let me imagine what I'll look like giving him a view of my world. In reality, my moustache has never been that robust!)