<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Notes]]></title><description><![CDATA[Real-world leadership and tech lessons, designed to help you build a Staff-level career.]]></description><link>https://notes.roydon.dev</link><image><url>https://substackcdn.com/image/fetch/$s_!tbcX!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce77853c-2f16-4416-af52-452b6b311248_600x600.png</url><title>Notes</title><link>https://notes.roydon.dev</link></image><generator>Substack</generator><lastBuildDate>Fri, 03 Apr 2026 18:56:24 GMT</lastBuildDate><atom:link href="https://notes.roydon.dev/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Roydon T]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[info@roydon.dev]]></webMaster><itunes:owner><itunes:email><![CDATA[info@roydon.dev]]></itunes:email><itunes:name><![CDATA[Roydon]]></itunes:name></itunes:owner><itunes:author><![CDATA[Roydon]]></itunes:author><googleplay:owner><![CDATA[info@roydon.dev]]></googleplay:owner><googleplay:email><![CDATA[info@roydon.dev]]></googleplay:email><googleplay:author><![CDATA[Roydon]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Requirement Analysis, infinite scroll Challenge, On-device inference]]></title><description><![CDATA[Sunday Newsletter]]></description><link>https://notes.roydon.dev/p/requirement-analysis-infinite-scroll</link><guid isPermaLink="false">https://notes.roydon.dev/p/requirement-analysis-infinite-scroll</guid><dc:creator><![CDATA[Roydon]]></dc:creator><pubDate>Sun, 16 Nov 2025 01:29:36 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!sok5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F100f6bed-5cda-4fc1-b25f-40e2f4f4b895_714x1599.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Welcome to the first issue of <em>my notes</em>.</p><p>My goal is to share my raw, in-the-trenches notes as a Head of Engineering. Each week, you&#8217;ll get one high-level insight from my &#8220;Work Journal&#8221; (on leadership, product, and strategy) and one deep-dive from my &#8220;Lab Notebook&#8221; (on code, tools, and architecture).</p><p>My hope is that this helps you navigate your own career, whether you&#8217;re moving from Senior to Staff or from Lead to Manager.</p><div><hr></div><h3>A Quick Thought on Requirements</h3><p>I was reminded this week of a simple truth: <strong>incomplete and vaguely worded requirements are the major source of rework</strong>, endless alignment meetings, and potential bugs.</p><p>Requirement analysis takes time and is hard to do thoroughly, but that upfront work will save you (and your team) days of future frustration.</p><div><hr></div><h3>Your Interview &#8220;No-Hustle&#8221; Tip</h3><p>Your interviewer is not just evaluating your coding ability; they are simulating what it&#8217;s like to <strong>work with you on a difficult problem</strong>.</p><p><strong>Narrate your thought process, </strong><em><strong>especially</strong></em><strong> when you are stuck or unsure.</strong></p><ul><li><p><strong>Before You Code:</strong> Don&#8217;t just dive in. Spend 30 seconds narrating your plan. Say, &#8220;Okay, I see the problem. It looks like I need to fetch data and then handle filtering. I&#8217;ll start by scaffolding the data-fetching logic. I&#8217;m assuming the API is paginated, so I&#8217;ll need to account for that. 
Is that a correct assumption?&#8221;</p><p><em>This proves you can handle ambiguity and make trade-offs.</em></p></li><li><p><strong>When You Get Stuck:</strong> Do not go silent. This is the most critical moment. Narrate your debugging. Say, &#8220;This is failing. I expected this function to return an array, but it&#8217;s returning <code>null</code>. Let me trace back... Ah, I see. I made a mistake in the <code>if</code> condition. The root cause is here. I&#8217;ll fix that.&#8221;</p><p><em>This proves you are a stable, logical operator who can recover from failure.</em></p></li></ul><p>Your goal isn&#8217;t to be perfect. Your goal is to be a <strong>great collaborator</strong>. A perfect coder who goes silent is a worse teammate than a good coder who communicates clearly.</p><div><hr></div><p>I created the first weekly challenge, based on React (web development).<br>Challenge 1: The &#8220;Infinite Scroll&#8221; Image Gallery</p><p>This challenges your understanding of the DOM, refs, and efficient list rendering.</p><ul><li><p><strong>The Task:</strong> Build a photo gallery that fetches data from a public API (like Unsplash or Lorem Picsum) and loads more items as the user scrolls to the bottom.</p></li><li><p><strong>Key Concepts:</strong> <code>useEffect</code>, <code>useRef</code>, the <code>IntersectionObserver</code> API, and cleanup functions.</p></li><li><p><strong>The &#8220;Intermediate&#8221; Twist:</strong></p><ol><li><p>Implement a <strong>virtualized list</strong> (windowing) manually or using a library to ensure the DOM doesn&#8217;t crash when 1,000 items are loaded.</p></li><li><p><strong>Preserve scroll position:</strong> If the user clicks an image to view details and hits &#8220;Back&#8221;, they should return to the exact same scroll position.</p></li></ol></li></ul><p>I will upload these challenges to my website.</p><div><hr></div><h3>I Tried to Run an LLM on My Phone</h3><p>We&#8217;ve all seen the &#8220;magic&#8221; of 
on-device AI. It&#8217;s the key to apps that are blazing fast, work offline, and actually respect user privacy (since the data never leaves the phone).</p><p>But there&#8217;s a huge gap between <em>knowing</em> this and <em>doing</em> this.</p><p>The core problem is simple: &#8220;How do I make a large, slow, power-hungry model small, fast, and efficient enough to run on a user&#8217;s phone without killing their battery?&#8221;</p><p>I decided to find out by embedding a text-gen model (Google&#8217;s Gemma) into a Kotlin app. I thought I&#8217;d spend all my time in Python converting models. But I was wrong.</p><p>Here&#8217;s the story of what I learned, and the &#8220;Aha!&#8221; moment that lets you skip 90% of the work.</p><h3>The Plan vs. The Reality</h3><p>My plan seemed simple:</p><ol><li><p>Choose a model (Gemma 2B, open-source and good for on-device).</p></li><li><p>Set up my Python environment (<code>pip install tensorflow keras-nlp</code>).</p></li><li><p>Download the model.</p></li><li><p>Convert it to a small TFLite file.</p></li><li><p>Build the app.</p></li></ol><p>This plan failed at step 3.</p><h3>&#8220;Aha!&#8221; Moment #1: The Hugging Face &#8220;Vanishing&#8221; Files</h3><p>I ran the CLI command to get the model: <code>hf download google/gemma-2b-it-tflite</code></p><p>It took 15 minutes. And then... nothing. The files weren&#8217;t in my project folder.</p><p>After some digging, I found the &#8220;Aha!&#8221; moment: The <code>hf</code> CLI saves files to a <strong>global cache</strong>, not your local directory.</p><p><strong>Pro-Tip:</strong> If you want files in your project, you <em>must</em> use the <code>--local-dir</code> flag: <code>hf download google/gemma-2b-it-tflite --local-dir ./models/</code></p><p>But this led to the <em>real</em> problem. The model was over 1GB. Even a &#8220;tiny&#8221; Gemma model I found was 150MB. 
This is still way too big for an app, and I hadn&#8217;t even <em>started</em> the hard part: figuring out the tokenizer, the inference loop, and the conversion process.</p><p>This is the part where most of us would quit.</p><h3>&#8220;Aha!&#8221; Moment #2: The Magic <code>.task</code> File</h3><p>I was stuck. I didn&#8217;t want to become a full-time model conversion engineer. I just wanted to build an app.</p><p>And that&#8217;s when I found the <em>real</em> secret: <strong>The MediaPipe LlmInference API.</strong></p><p>Here&#8217;s the entire &#8220;Aha!&#8221; moment: You don&#8217;t need to manually manage the tokenizer, the TFLite conversion, and the text generation. The MediaPipe library does it all.</p><p>But it needs a special kind of file. You don&#8217;t use the raw Hugging Face model. You need a pre-converted <code>.task</code> file.</p><p>I found this model: <code>litert-community/gemma-3-270m-it</code></p><p>The file <code>gemma3-270m-it-q8.task</code> (304 MB) is a special bundle that <strong>already contains everything:</strong></p><ul><li><p>The quantized TFLite model</p></li><li><p>The tokenizer</p></li><li><p>All the necessary configuration</p></li></ul><p><strong>This means you get to skip the entire Python conversion and quantization nightmare.</strong> You just download this one file.</p><h3>The &#8220;Last Mile&#8221;: Building the App</h3><p>This one file changed everything. The problem was no longer &#8220;How do I convert a model?&#8221; It was &#8220;How do I get this 304MB file into my app?&#8221;</p><p>It&#8217;s still too large for the app&#8217;s built-in <code>assets</code> folder. The solution? 
Push it to the device&#8217;s local storage manually using <code>adb</code>.</p><ol><li><p>Connect your phone (with USB debugging on).</p></li><li><p>Run these commands to create a folder and push the file:</p></li></ol><p><code># Create the directory on your phone</code></p><p><code>adb shell "mkdir -p /data/local/tmp/llm"</code></p><p><code># Push the .task file</code></p><p><code>adb push /path/to/your/gemma3-270m-it-q8.task /data/local/tmp/llm/</code></p><ol start="3"><li><p>Now, in your Kotlin Compose app, just add the dependency:</p></li></ol><p><code>// build.gradle.kts</code></p><p><code>implementation("com.google.mediapipe:tasks-genai:0.10.11")</code></p><ol start="4"><li><p>And give your app permission to read it:</p><p><code>&lt;!-- AndroidManifest.xml --&gt;</code></p><p><code>&lt;uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"</code></p><p><code>     android:maxSdkVersion="32" /&gt;</code></p></li></ol><p>And that&#8217;s it. My app can now find and access the model file. 
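</p><p>As a quick preview, here is a minimal, hedged sketch of what loading that file can look like with the MediaPipe Tasks GenAI API. Treat it as a sketch, not final code: <code>context</code> is assumed to come from your Activity, and the available builder options may differ between library versions.</p><p><code>// Sketch: initialize MediaPipe LlmInference with the pushed .task file</code></p><p><code>import com.google.mediapipe.tasks.genai.llminference.LlmInference</code></p><p><code>val options = LlmInference.LlmInferenceOptions.builder()</code></p><p><code>    .setModelPath("/data/local/tmp/llm/gemma3-270m-it-q8.task")</code></p><p><code>    .build()</code></p><p><code>val llm = LlmInference.createFromOptions(context, options)</code></p><p><code>val reply = llm.generateResponse("Hello from on-device Gemma!")</code></p><p>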
The hardest part of the journey was over.</p><p>After some tweaking of the UI layout and fixing an overlapping-views issue, here is the final screenshot:</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!sok5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F100f6bed-5cda-4fc1-b25f-40e2f4f4b895_714x1599.jpeg" width="714" height="1599" alt="Screenshot of the final on-device chat app" loading="lazy"></figure></div><h3>My Key Takeaways</h3><ul><li><p><strong>Stop! Don&#8217;t convert.</strong> Your first step shouldn&#8217;t be <code>pip install</code>. It should be finding a pre-converted <code>.task</code> file for MediaPipe.</p></li><li><p><strong>The </strong><code>.task</code><strong> file is the real &#8220;shortcut.&#8221;</strong> It bundles the model, tokenizer, and config, skipping the hardest part of on-device AI.</p></li><li><p><code>adb push</code><strong> is your best friend.</strong> For any model over ~100MB, don&#8217;t bother with app assets. Just push it directly to device storage for testing.</p></li></ul><p>Next time, I&#8217;ll share the Kotlin code to actually initialize the <code>LlmInference</code> client and start chatting with the model.</p><div><hr></div>]]></content:encoded></item></channel></rss>