The online home of Andrew Joyce

Adding the Wyze Bulb Cam to Apple Homekit using AI

I have a love-hate relationship with smart devices. I appreciate how much they do for me, but then when the power goes out half of them don’t work without being reset. I recently switched routers and found out that Netgear is much worse than Google at papering over the problems that can arise with simple smart home devices like bulbs, switches, and plugs. All this to say: when I die, someone check on my wife and give her a crash course in Netgear router config.

I bought a Wyze Bulb Cam to replace the Ring doorbell we’ve had for years. Amazon bought Ring almost as soon as we’d bought into the Ring ecosystem, and Amazon is my least-favorite tech company, and the one I trust the least with my data.

But, I’m nothing if not slapdash, so I bought the Wyze Bulb Cam because it was on sale and it wasn’t Amazon. Side note: if anyone knows something about isolating cameras from ‘calling home,’ I’m all ears. Our other cameras are Eufy, which seem pretty sanitized and limited to our local network (with SD card recording), but Eufy didn’t have a solid outdoor camera that fit the bill. I don’t know how bad Wyze is at calling home, but I’d rather almost anyone else be snooping on me than Amazon.

At least we have a light bulb socket out front! I wanted a powered camera because I’d gotten sick of replacing the Ring battery every other week (seven-year-old battery, go figure). I thought (famous last words) that I would be able to use Homebridge to get the Wyze camera into HomeKit. There was a plugin and everything. So after some quick Google searches I bought the Bulb Cam.

Unfortunately, the Bulb Cam isn’t supported. Like at all. As far as I know, I may be the only person in the universe who even bought one of these lightbulb cameras. They are kinda funky.

The Setup

There is no hope from Wyze’s end. The feature request for HomeKit integration has been open since 2018, has 970 replies, 133,000 views, and Wyze has basically said, “forget about it.” They further said that IF they do HomeKit integration, it wouldn’t be for camera streams.

(And if you want a Matter camera, there aren’t many to speak of that are affordable, and none that screw into a lightbulb. I guess I got what I paid for).

Anyways. I had a Bulb Cam for $50, and no way to actually access the video except in Wyze’s app. That bothered me for some reason, and I really didn’t want this camera to turn into e-waste, even though Wyze itself probably had nothing more in mind than selling a few to be landfill fodder in a year or two.

This is destined to fail, isn’t it?

I knew if I could get the camera to broadcast RTSP video, there were hooks to pull that into Homebridge, specifically with the homebridge-camera-ffmpeg plugin (https://github.com/homebridge-plugins/homebridge-camera-ffmpeg).

The catch would be getting the Bulb Cam to actually be picked up and converted to an RTSP stream.

This post is the technical writeup, including a Docker networking bug that wasted hours of my life and may be the actual reason this camera is reported as broken everywhere on the internet.

Dead Ends

When I googled ‘Wyze cameras in HomeKit,’ this is what I turned up:

  1. homebridge-wyze-smart-home plugin — handles bulbs, plugs, sensors. Cameras only get a basic on/off switch, no video. The beta version (2.0.0) supposedly adds support for camera video, but not for the Bulb Cam. I installed and tried it; it didn’t recognize my camera.
  2. mrlt8/docker-wyze-bridge — extracts video and republishes it as an RTSP stream. It only works with Wyze cameras that use the TUTK protocol (V2/V3/V4/Doorbell), and issue #1480 flags the Bulb Cam as “not supported.”
  3. Cryze — for “Gwell” cameras (OG, OG Telephoto) that use a different SDK. This is super overkill (Claude suggested it): basically you run Android in a Docker container, with a custom Android app that fakes the Wyze app entirely, and then republish the video as an RTSP stream. Apparently Raspberry Pi OS doesn’t have the components or the juice to do this justice.
  4. Custom firmware (Thingino, wz_mini_hacks) — only works on Ingenic-based cameras (T20/T31/T41 SoCs): there’s nothing for the Bulb Cam.

I couldn’t even get a straight answer on what protocol the Bulb Cam uses. Some sources said Gwell, some said TUTK. I eventually learned that it’s TUTK (kind of), with some port weirdness.

What worked

I found a complete Go rewrite of docker-wyze-bridge by IDisposable. Unlike the Python version, it bundles support for TUTK, Gwell, and WebRTC cameras in one bridge, so if I was going to have a shot, this would be it.

With Claude’s help, I set up this basic pipeline:

Wyze Bulb Cam (HL_BC, 192.168.0.32)
        ↓ TUTK P2P over UDP (port 32761)
docker-wyze-bridge v4.3.0 (Go rewrite)
        ↓ embeds go2rtc 1.9.14
go2rtc (wyze:// source → RTSP republish)
        ↓ rtsp://VM:8554/wyze_bulb_cam
homebridge-camera-ffmpeg (on Raspberry Pi)
        ↓ HomeKit bridge
Apple Home app

I didn’t want to overload the Pi (it’s also running Pi-hole and a Plex media server, in addition to my Homebridge), so I installed docker-wyze-bridge on my Immich server, which is a Linux virtual machine running inside an old Windows 11 box. I hadn’t had networking problems with Immich, so I foolishly thought this would be similar (the virtual machine is what ended up tripping me up here).
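For reference, the bridge config started out looking roughly like this. This is a sketch, not my exact file: the image name is a placeholder, the credential variables are the ones documented for the original docker-wyze-bridge, and the ports: block is what Docker publishes through its NAT layer:

```yaml
services:
  wyze-bridge:
    # image name is a placeholder; use whatever the Go rewrite's README specifies
    image: docker-wyze-bridge:latest
    restart: unless-stopped
    environment:
      # credentials, as documented for the original Python bridge
      - WYZE_EMAIL=${WYZE_EMAIL}
      - WYZE_PASSWORD=${WYZE_PASSWORD}
    ports:
      - "8554:8554"   # RTSP out to Homebridge
```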

Claude walked me through the setup and we got everything rolling, except I got this error on every play attempt:

wyze: connect failed: discovery timeout

I tried a lot of AI-suggested troubleshooting, and none of it helped.

Claude concluded at this point: “the Bulb Cam’s IOTC handshake is genuinely incompatible with go2rtc’s wyze source. Time to file a GitHub issue and move on.”

Instead, I asked Claude, “what would a kernel developer do at this point? Suppose we are someone trying to build out support for the Bulb Cam, what would we do?” The answer: packet-sniffing. It’s time to start trying to reverse-engineer this.

sudo tcpdump -i any -w /tmp/bulbcam.pcap host 192.168.0.32

Claude then looked at the IPv4 conversations in Wireshark. At this point, troubleshooting the pcap, we are beyond any sort of knowledge that I have or could hope to acquire.
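If you’d rather skip the Wireshark GUI, the same per-endpoint conversation summary can be pulled straight from the capture with tshark (Wireshark’s command-line twin), assuming it’s installed on the capture machine:

```
tshark -r /tmp/bulbcam.pcap -q -z conv,ip
```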

Claude’s Diagnosis

(This section of this post was composed with AI when I asked Claude to explain its work).

Looking at the IPv4 conversations in Wireshark:

Endpoint pair                        Outbound       Inbound
Container (172.19.0.2) ↔ camera      560 packets    0 packets
Host (192.168.0.249) ↔ camera        300 packets    140 packets

The camera was responding! 140 packets of replies were arriving at the host. They just weren’t reaching the container. The first 8 frames tell the whole story:

1   172.19.0.2     → 192.168.0.32   UDP  src=51421 dst=32761  (container sends discovery)
2   172.19.0.2     → 192.168.0.32   UDP  src=51421 dst=32761
3   192.168.0.249  → 192.168.0.32   UDP  src=51421 dst=32761  (after NAT, leaving host)
4   172.19.0.2     → 192.168.0.32   UDP  src=51421 dst=32761
5   172.19.0.2     → 192.168.0.32   UDP  src=51421 dst=32761
6   192.168.0.249  → 192.168.0.32   UDP  src=51421 dst=32761
7   192.168.0.32   → 192.168.0.249  UDP  src=48745 dst=51421  (camera responds!)
8   192.168.0.249  → 192.168.0.32   ICMP type 3 code 3        (host: "no listener on 51421")

The camera responds from a different source port (48745) than the request hit (32761). Linux’s nf_conntrack_udp doesn’t recognize this as a valid reply for the existing flow, so it has no NAT mapping to apply when forwarding the packet back into the container. The packet hits the host’s network stack where nothing is listening, and the host responds with ICMP Port Unreachable.

That ICMP unreachable, repeated 20+ times over the capture window, is the real reason for the “discovery timeout.” Not a protocol mismatch. The bridge never sees the camera’s responses because Docker’s bridge networking drops them.
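You can watch the kernel reach the same conclusion while the bridge retries. This assumes the conntrack CLI is installed on the Docker host; the output line below is paraphrased from this setup, not copied verbatim:

```
sudo conntrack -L -p udp --dport 32761
# udp ... src=172.19.0.2 dst=192.168.0.32 sport=51421 dport=32761 [UNREPLIED] ...
```

The [UNREPLIED] flag means conntrack never accepted any packet as a reply to this flow, which is exactly why there was no NAT mapping to carry the camera’s responses back into the container.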

Result: the front porch camera shows up in the Home app with live view. No HKSV recording yet (that would require Scrypted), but the live view alone closed a long-standing gap.

(end Claude-composed stuff)

The Fix

Claude suggested replacing the ports: block in docker-compose with network_mode: host. Apparently network_mode: host puts the container directly in the host’s network namespace, eliminating the NAT layer entirely, so UDP responses now reach the bridge directly. Again, we are way beyond my pay grade here.
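In compose terms the change is small. A sketch (service name is a placeholder; note that Compose does not allow ports: and network_mode: host on the same service, so the ports: block has to go, not just be supplemented):

```yaml
services:
  wyze-bridge:
    # before: published ports through Docker's bridge NAT
    # ports:
    #   - "8554:8554"
    # after: share the host's network namespace, no NAT at all
    network_mode: host
```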

And I got live video. From there, it was pretty simple to get the RTSP stream into HomeKit (again, using Claude’s guidance). We’re back into stuff I’ve done before (installing/configuring Homebridge plugins) so I felt more comfortable following the instructions at this point. I tossed the live source URL and stillImageSource URL into homebridge-camera-ffmpeg directly. From there, Homebridge passed it to HomeKit just like any other camera (with the custom setup code that shows up in the terminal logs, just like the Ring plugin).
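For completeness, the relevant chunk of my Homebridge config.json looked something like this. The camera name and stream URL are examples (substitute your VM’s address); platform, source, and stillImageSource are the keys documented by homebridge-camera-ffmpeg:

```json
{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Front Porch",
      "videoConfig": {
        "source": "-i rtsp://<vm-ip>:8554/wyze_bulb_cam",
        "stillImageSource": "-i rtsp://<vm-ip>:8554/wyze_bulb_cam",
        "audio": false
      }
    }
  ]
}
```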

(For now, the camera does live view only, with no HomeKit Secure Video. For that I would need to figure out Scrypted, but I think I’m happy to leave it here. Maybe someday).

Mission accomplished. This camera will stick around my house much longer now that it is just a dumb pipe for video. I’d like to keep fiddling with it to see if I can tame the network traffic of all my cameras and keep it from phoning home altogether, but we’ll see if AI has anything to say about that.

Speaking of AI

I haven’t published anything of note about my current stance towards AI. To read my front page, you might think I’m 100% opposed to AI, but like all things, it’s complicated (FWIW, that statement is much more about the hoovering up of content).

Obviously, I used an AI large language model to get here, specifically: Claude. I couldn’t have done this without AI, let’s be clear. For reading documentation, technical specifications, and diagnosing Linux network packets, Claude is a much, much more capable tool than my own Google-fu and limited knowledge.

I’m not a command-line novice, but I’m also not a command-line junkie. I’m willing to copy-paste commands as needed, and I can usually tell enough to make sure they’re okay-ish. Obviously an errant flag or parameter could screw things up, but in most cases Claude is just pulling these commands straight from documentation that is inscrutable to me. And for messing around on a Pi? I’m not concerned with nuking a setup if things go awry.

What would have taken me weeks of opening tickets on various open-source projects and annoying real developers with dumb questions took me one night of hacking and troubleshooting on my Pi. Honestly, I got a lot more help this way than I would have by bothering people who know way more than me on a community Discord, or something. I learned. Isn’t that what the Raspberry Pi is for?

Accomplishing this gave me such a rush. It works! I’ve found a lot of personal utility for large language models in diagnosing my homelab stuff. This isn’t the first time, either: I wouldn’t have gotten Immich running (inside a VM inside Windows 11 after my Linux box failed, to boot) at all without AI’s help.

I am happy to give an LLM the credit for this. Sure, I prompted it, but it was an actually-useful tool for troubleshooting and diagnosing. I’m a web developer without much hard Linux/network knowledge. I was able to use what the LLM output to create something that, as far as I can tell, hasn’t been done before. That’s awesome.

I have more to write about AI use, specifically in the field of web development (which I work in). I’m of two minds about a lot of it, and I have some thoughts I need to get down. For the purposes of this project? I’m thrilled that I was able to fix something that’s been bothering me. Something of value was created here. This wouldn’t have happened if I had been on my own.

Credits

This was tested on one Bulb Cam with one firmware version. YMMV. But hopefully this helps the next person looking to connect this camera to their HomeKit (except I have this site excluded from LLM crawling and Google crawling — you see what an impossible tension individuals live in these days? I should probably sacrifice this post to the maw of C O N T E N T in the name of helping out others who have the same problem as me, right? Pass it on? Who knows. I have more thoughts there but this post is long enough already).