sidcodehttps://sidcode.github.io/2021-12-21T04:32:00+08:00sidcode2021-12-21T04:32:00+08:002021-12-21T04:32:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2021-12-21:/letters/sidcode/<p>I’m Sid / <em>sidcode</em> / Siddhant. My name means <strong>principles / tenets</strong>. That is probably why I was naturally drawn to the <a href="proxy.php?url=https://kernel.community/en/learn/">Kernel syllabus</a> ;) Soon after my block wrapped up for me as a Fellow, I was asked if I wanted to be a Steward. I accepted this gift of a responsibility …</p><p>I’m Sid / <em>sidcode</em> / Siddhant. My name means <strong>principles / tenets</strong>. That is probably why I was naturally drawn to the <a href="proxy.php?url=https://kernel.community/en/learn/">Kernel syllabus</a> ;) Soon after my block wrapped up for me as a Fellow, I was asked if I wanted to be a Steward. I accepted this gift of a responsibility to best map out this unparalleled community of care that is building the heart of our interdependent web. This post is an attempt at mapping my meaningful meanders. I’m <a href="proxy.php?url=https://blog.ncase.me/evolution-not-revolution/">evolving</a> just like our society evolves with its interdependent pillars- <img alt="Loopy Society" src="proxy.php?url=https://sidcode.github.io/images/articles/2021/opensourceloopy.gif"> This post maps out my life among each of web3, technology, culture, policy, economics, and personal aspects. This post has been salted and peppered to perfection with links for you to dig deeper into all these rabbitholes. Here we go- </p> <h2>Web3 journey so far:</h2> <ul> <li><strong>First foray</strong>: introduced to build on <a href="proxy.php?url=https://ipfs.io/">IPFS</a> in 2016 and stumbled upon Juan Benet <sup id="sf-sidcode-1-back"><a href="proxy.php?url=#sf-sidcode-1" class="simple-footnote" title=" Unrelated: I was invited to speak at the same symposium at Stanford 3 years later. Here’s the link to that talk :) ">1</a></sup> (accidentally). 
Read the <a href="proxy.php?url=https://fermatslibrary.com/s/bitcoin">annotated Bitcoin paper</a> too but did not grok the social consequences fully until much later.</li> <li> <p><strong>Academic</strong>: researching smart contracts, <a href="proxy.php?url=https://docs.google.com/presentation/d/1mYL72VPl5AY6VyzL1v70c9AHYeqhsRDGAYRgAYwTqR4/edit?usp=sharing">memes</a>, communities, and NFTs via <a href="proxy.php?url=https://www.initc3.org/">IC3, Cornell University</a>. </p> </li> <li> <p><strong>NFTs</strong>: At IC3, we (Sarah Allen, Prof. Ari Juels, Prof. Mukti Khaire, Tyler, and I) published <a href="proxy.php?url=https://twitter.com/initc3org/status/1518627597500010497">a pretty extensive Primer on the past, present, and future of NFTs</a>. I <a href="proxy.php?url=https://www.linkedin.com/posts/sohwanwei_the-nft-mania-ugcPost-6923613725593677824-JvZL">was also interviewed</a> for <strong>NFT Mania</strong> by Wan Wei.</p> </li> <li><strong>DAOs</strong>: GitcoinDAO, Token Engineering Commons, SuperTeam, Algovera, ScienceFund, Verses</li> <li> <p><strong>GitcoinDAO</strong>: I’ve been fortunate to be a Steward at <a href="proxy.php?url=https://gitcoin.co/">GitcoinDAO</a> and help build sustainable rewards systems for building the open web. I started as an early contributor to the <a href="proxy.php?url=https://moonshotcollective.space/">Moonshot Collective</a>, helped on web3’s first ever <a href="proxy.php?url=https://newsletter.banklesshq.com/p/the-ultimate-dao-report">ultimate DAO workers report</a>, served as tech lead for <a href="proxy.php?url=https://m-m-m.xyz/">MMM</a> (memes, merch, marketing), and as a librarian at the <a href="proxy.php?url=https://gitcoin.co/blog/seeking-a-new-kind-of-public-good/">Public Goods</a> workstream. 
<em>Public Goods are good</em> and <em>It is all coordination</em>.</p> </li> <li> <p><strong>Governauts</strong>: researching reward systems for the next wave of DAO coordination on a governauts scholarship with <a href="proxy.php?url=https://tokenengineeringcommunity.github.io/website/docs/academy-welcome/#dao-rewards-systems">Token Engineering Commons</a>. <a href="proxy.php?url=https://docs.google.com/presentation/d/13E57_XoqaKc47Gq2_iu6ze9zHLKbBKxxF-yT-EBUVvw/edit#slide=id.g11640895d61_0_12">Here’s a talk I gave on CommonsWTH</a> - measuring the wealth, truth, and health of our commons.</p> </li> <li><strong>Projects</strong>: <a href="proxy.php?url=https://gitcoin.co/grants/4151/silentauth-the-de-facto-multi-factor-authenticator">Silent Auth</a>, <a href="proxy.php?url=https://gitcoin.co/grants/4307/deschool">DeSchool</a>, contributor to <a href="proxy.php?url=https://memes.party">memes.party</a>, Declaration of <a href="proxy.php?url=https://www.interdependence.online/declaration">Interdependence of Cyberspace</a>, The Truth Wins - <a href="proxy.php?url=https://www.linkedin.com/pulse/truth-wins-campaign-press-freedom-initiative-partnership-/">a censorship-resistance project</a> with Reporters Without Borders, DDB, and Hyperinteractive Studios.</li> <li><strong>Hackathons</strong>: won the first prize and popularity prize at the <a href="proxy.php?url=https://sbic2021.sbip.sg/">Singapore Blockchain Innovation Challenge</a> in December 2021 for cross-chain community authentication for DAOs, won the <a href="proxy.php?url=https://twitter.com/balajis">Balaji Srinivasan</a> Prize at the <a href="proxy.php?url=https://buildingoutloud.solana.com/">Solana India hackathon</a> in September 2021, a <a href="proxy.php?url=https://devpost.com/software/charpie">special prize</a> at HackZurich 2021, and the overall prize at <a href="proxy.php?url=https://www.globaltechchallenge.com/">National Blockchain Challenge</a> in 2019 in Singapore. 
The real prize in all of these is my friends, whom I got to spend those hours with, and the new friends I made subsequently! I’ve also judged quite a few hackathons during this journey and am finding ways to reimagine and improve how hackathons are designed. In 2022, I am helping find a <em>better</em> way of building impactful hackathons with <strong>better.sg</strong> - a tech4good non-profit in Singapore. Hackathons should not be winner-take-all.</li> </ul> <h2>Tech / (Computer) Science:</h2> <ul> <li><strong>Security</strong>: <a href="proxy.php?url=https://www.youtube.com/watch?v=omR7R4wvK8g">researching, speaking, writing</a>, and building cyber-physical security systems for national critical infrastructures at the iTrust Centre for Research in Cyber Security, Singapore University of Technology and Design</li> <li><strong>Space</strong>: satellite image processing and remote sensing for the Indian Space Research Organization. My first foray into free and open source software. Also worked on astronaut-rover interactions with the Italian Mars Society, which liaises with the European Space Agency (described below in <em>Robotics</em>)</li> <li><strong>Distributed Systems</strong>: engineering at one of the best systems teams at Goldman Sachs</li> <li><strong>Robotics</strong>: Worked extensively with mobile robot teams and rovers via Google Summer of Code, the Italian Mars Society, and the Python Software Foundation. 
<a href="proxy.php?url=https://www.cyber-physical.space/categories/gsoc/">Blog link</a></li> <li><strong>Extended Reality</strong>: designing explainable security and smart cities for <a href="proxy.php?url=https://www.lun.com/Pages/NewsDetail.aspx?dt=2018-11-16&amp;PaginaId=28&amp;bodyid=0">policymakers through VR/AR/MR</a>.</li> <li><strong>Papers and Patents</strong>: despite their pitfalls, centralized institutions have assembled our knowledge graph reasonably well so far for <a href="proxy.php?url=https://scholar.google.co.in/citations?view_op=list_works&amp;hl=en&amp;hl=en&amp;user=u4_lYmUAAAAJ&amp;alert_preview_top_rm=2&amp;scilu=&amp;scisig=AMD79ooAAAAAYcuboWhjqQnNuAekCyEzQAh41NZgqjPs&amp;gmla=AJsN-F7zSFeadarueSUnSgjOWEfAVKXiomWos9f5D1MtTwXGNlTzmFLFD_T0egP6onwPcDxqU4q2dT4R5YZHs7SpA2_hz1yg7vC7ijKUIZBo9ad327pKksI&amp;sciund=15946461532337949835">getting one’s ideas out there</a>. It is also important to build and explore decentralized infrastructure for science (DeSci). An early attempt at that <a href="proxy.php?url=https://mirror.xyz/docxology.eth/6NTVACMLec12YdZGro_DEIf5mLOkNKKgfKCdToB0UjA">is this DeSci paper</a> that a bunch of us collectively co-authored to help wayfinders figure out this dichotomy. This is a first in many aspects. I publish in the fields of security, critical infrastructures, cyber-physical systems, decentralized systems, artificial intelligence, big data, science, education, and training. I re-cognize that there are a few buzzwords in there so I implore you to reach out to me to gauge how superficial/deep my knowledge is in each of these fields (and more). Re-search is its own wonder-full reward.</li> <li><strong>Education</strong>: I <a href="proxy.php?url=https://www.youtube.com/watch?v=tIc3B9CCHQQ&amp;list=PLHow0r_Oae-9CxD4Ca5HYwxtY15IyAp6W&amp;index=9">gave a talk</a> on established and emerging aspects in Education (including how cyber-physical campuses and metaversal learning unfolds). 
My friends and I are helping build the grassroots infrastructure for better schooling that goes beyond conventional EdTech companies built around capturing the attention, retention, and intention economy. These aspects are <a href="proxy.php?url=https://coda.io/d/DeSchool_dRavbDJfPhH/Overview_suOuD">all covered in DeSchool</a>.</li> </ul> <h2>Economics, Policy:</h2> <ul> <li><strong>Cyber-physical security</strong>: actively involved in co-organizing and contributing to global and local cyber-physical security exercises, like NATO’s <a href="proxy.php?url=https://ccdcoe.org/exercises/locked-shields/">Locked Shields</a> and <a href="proxy.php?url=https://itrust.sutd.edu.sg/ciss/ciss-2021-ol/">Critical Infrastructure Security Showdown</a>. I <a href="proxy.php?url=https://www.youtube.com/watch?v=JJ5xZvcEE1I&amp;t=1826s">gave a talk about it here</a> at CyCon, a NATO conference.</li> <li><strong>Startup and Venture life</strong>: helping build more secure and usable authentication systems, advising companies, and building teams for companies at all stages.</li> <li><strong>Pseudonymous work</strong>: editor of the sequel to a top Bitcoin and economics book that you have most likely read.</li> <li><strong>Banking and the World Economy</strong>: learned a lot about private/public investment management, banking, securities, and markets at my first job as a strategic analyst at Goldman Sachs. 
I keep learning the rest from many different legendary voices on different sides of the table (Lyn Alden, Ray Dalio, Rob Breedlove, Raghuram Rajan, etc.).</li> <li><strong>Consulting</strong>: advising policymakers and companies who are building the future of society across different domains.</li> </ul> <h2>Community (local):</h2> <ul> <li><a href="proxy.php?url=https://t.me/+eFnK4ZqSv-gxYWRl">DesignSG</a> | founder and co-admin for a community of 3100+ designers.</li> <li><a href="proxy.php?url=https://t.me/joinchat/8dfMVX-gYQlhMDU9">CryptoTechSG</a> | founder and co-admin for a local community of crypto practitioners. Fun fact: I made Vitalik <a href="proxy.php?url=https://twitter.com/sidcode_/status/1443202231293992960?s=20">give a shoutout</a> to the community in one of his AMAs.</li> <li><a href="proxy.php?url=https://better.sg">Better.sg</a> | volunteering as a project co-lead and builder in multiple projects <a href="proxy.php?url=https://projects.better.sg">listed here</a> on <a href="proxy.php?url=https://saylah.sg">non-verbal accessibility</a>, safety, and <a href="proxy.php?url=https://tobeyou.sg/">multicultural issues via gaming and interactive fiction</a>.</li> <li>Other Singapore-based groups: <a href="proxy.php?url=https://t.me/joinchat/a62YbPmEllxhMDll">ProductSG - 1100+ members</a>, GatherSG, Creative Technologists </li> <li>India-based groups: Climbing (<a href="proxy.php?url=https://www.youtube.com/watch?v=gGw6QyDipkI">Bangalore Climbing Initiatives</a>) and Skating (<a href="proxy.php?url=https://www.instagram.com/bengaluruskaters/?hl=en">Bengaluru Skaters</a>)</li> </ul> <h2>Community (international)</h2> <ul> <li><a href="proxy.php?url=https://1729.com">1729</a> | Contributor to DAO and Security discussions, hosting Singapore events.</li> <li>Steward at <a href="proxy.php?url=https://kernel.community">Kernel</a> and <a href="proxy.php?url=https://gitcoindao.com">Gitcoin</a>.</li> <li><a href="proxy.php?url=https://getdweb.net/#nodes">DWeb</a> | 
Helping steward the Singapore and Bengaluru nodes.</li> <li><a href="proxy.php?url=https://superteam.fun">Superteam</a> member | not as active as I’d like</li> <li><a href="proxy.php?url=https://www.complexityweekend.com/">Complexity Weekend</a> | Facilitator, eusocial participAnt, and co-organizer.</li> <li><a href="proxy.php?url=https://www.activeinference.org/">Active Inference Lab</a> | curious student of this emerging field and have contributed <a href="proxy.php?url=https://mirror.xyz/docxology.eth/6NTVACMLec12YdZGro_DEIf5mLOkNKKgfKCdToB0UjA">to a DeSci paper</a> and a <a href="proxy.php?url=https://www.youtube.com/watch?v=Dl6v-3COgCo">robotics livestream</a>.</li> </ul> <h2>Culture:</h2> <ul> <li><strong>Poetry</strong>: expressing computing culture by translating classical Urdu/Hindi poetry to English at <a href="proxy.php?url=https://twitter.com/CodeShayari/">CodeShayari</a>. <a href="proxy.php?url=https://docs.google.com/document/d/1neMrjcQ2vjJAl7ZE02JlV6wELbxuirITpdQl_Ti1spk/edit#heading=h.zcwjgo9spjww">Selected works</a>.</li> <li><strong>Design</strong>: MIT Media Lab <a href="proxy.php?url=https://www.indiainitiative.mit.edu/">design innovation workshop</a>. 
I designed <a href="proxy.php?url=https://www.youtube.com/watch?v=K5bR0ndroVk">Smart Textiles</a>.</li> <li><strong>Storytelling</strong>: bringing stories to code and pull requests (pseudonymous stealth project, to be released soon)</li> <li><strong>Humanities</strong>: involved in multiple communities studying forgotten/current/emerging cultures (around the world in cyber-physical spaces)</li> </ul> <h2>Personal:</h2> <ul> <li><strong>Physical Culture</strong>: amateur bodybuilding, strength training, calisthenics, martial arts (Kalaripayattu), and aiming for the foundational <a href="proxy.php?url=http://journal.crossfit.com/2002/04/foundations.tpl">domains of fitness</a> </li> <li><strong>Longevity</strong>: been a training/nutrition/sleep geek since before it was cool</li> <li><strong>Food</strong>: Love everything and anything about the form and function of food of all kinds</li> <li><strong>Generalist</strong>: very putty-like | interested in many things</li> <li><strong>Sports</strong>: climbing, skating, football, frisbee, dragon boating, swimming, and soon table tennis (thanks to a friend)</li> <li><strong>Dance</strong>: enthusiast (about music in general)</li> <li><strong>Filmmaking</strong> and combining hobbies, like <a href="proxy.php?url=https://www.youtube.com/watch?v=gGw6QyDipkI">this one</a> where I combined climbing, photography, and film production. 
<a href="proxy.php?url=https://www.youtube.com/watch?v=4zUsWyM5NFA">Here’s the link to the full documentary</a> which was showcased at a few film festivals.</li> </ul> <h2>Upbringing:</h2> <ul> <li>90s kid</li> <li>a torchbearer of <a href="proxy.php?url=https://twitter.com/sidcode_/status/1448585694041960448?s=20">the relay generation</a>, the last one to be proficient with both analog and digital worlds</li> <li>mixed culture kid (genotypically and phenotypically)</li> <li>moved around quite a bit in different geographies across continents - coast, desert, valley, mountain, wetland, island, and soon in a network state (maybe).</li> <li>used to sing as a kid in the Christmas choir and Indian Classical Music. Then life happened. Now it is a plan for later.</li> </ul> <h2>Socials:</h2> <ul> <li><a href="proxy.php?url=https://twitter.com/sidcode_">@sidcode on Twitter</a> </li> <li><a href="proxy.php?url=https://linkedin.com/in/sidcode">@sidcode on LinkedIn</a></li> <li>sidcode#1729 on Discord</li> </ul> <p>I’m gradually giving up on all other social networks before they give themselves up. Once bitten, twice shy. Looking at you, G+ :) </p> <p>Thanks for reading! If you’ve read this far, ping me and <a href="proxy.php?url=https://www.kernel.community/en/learn/module-7/the-gift/">I’ll gift</a> you <a href="proxy.php?url=https://poap.xyz/">my personal POAP</a> ;)</p> <p><em>Fin</em></p><ol class="simple-footnotes"><li id="sf-sidcode-1"> Unrelated: I was <a href="proxy.php?url=https://www.youtube.com/watch?v=omR7R4wvK8g">invited to speak</a> at the same symposium at Stanford 3 years later. 
<a href="proxy.php?url=https://youtu.be/omR7R4wvK8g">Here’s the link</a> to that talk :) <a href="proxy.php?url=#sf-sidcode-1-back" class="simple-footnote-back">↩</a></li></ol>Meme Museum2021-12-15T20:50:00+08:002021-12-15T20:50:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2021-12-15:/letters/meme-museum/<h1>https://memes.party is live!</h1> <p>At every Memepalooza, hundreds of memes per month are produced. The themes span across public goods, web3, the open internet and the other values Gitcoin and GitcoinDAO cares about. Right now these memes get lost and forgotten as new memes pop up. We MUST curate …</p><h1>https://memes.party is live!</h1> <p>At every Memepalooza, hundreds of memes per month are produced. The themes span across public goods, web3, the open internet and the other values Gitcoin and GitcoinDAO cares about. Right now these memes get lost and forgotten as new memes pop up. We MUST curate them. Enter Meme Museum 🤡</p> <p>The Meme museum aims to help fix this by celebrating great memes and great meme artists by curating memes, displaying them, telling stories about them, and making it easy for anyone to visit the digital museum, and share them to help push forward the spirit of GitcoinDAO.</p> <p>Memes as xDAI NFTs ? Also helps with onboarding web2 memelords to web3 :)</p> <p>Memepalooza participants also seemed hyped about this, but that is a biased audience 😂</p> <p>The problem we faced: memes get lost and underappreciated in the discord UX. They should be out in the open and not in some walled garden. 
</p> <p>Solution - a web3 museum of memes where the best memes and memelords get appreciated with votes and tips.</p> <h1>Version 1 of the Meme Museum:</h1> <ul> <li>View memes on the frontpage - sorted by upvotes, daily meme (knowyourmeme style)</li> <li>Sign in with web3 to upvote/downvote/upload</li> <li>Upload memes in image formats (.jpg/.png)</li> <li>Add stories to every meme&rsquo;s page (knowyourmeme)</li> <li>Credit each meme to the possible OG meme artists who created it first</li> </ul>My Kernel Intro - sidcode2021-09-24T04:32:00+08:002021-09-24T04:32:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2021-09-24:/letters/my-kernel-intro-sidcode/<p>Hello <em>Kernel</em>! My name is Sid / <em>sidcode</em> / Siddhant. It means <strong>principles / tenets</strong>. That is probably why I was naturally drawn to the <a href="proxy.php?url=https://kernel.community/en/learn/">syllabus</a> ;)</p> <p><strong>Goals</strong> - listening as much as possible, vibing<sup id="sf-my-kernel-intro-sidcode-1-back"><a href="proxy.php?url=#sf-my-kernel-intro-sidcode-1" class="simple-footnote" title="And to be one of the best contributors in our memepool.">1</a></sup>, and building <strong>together, for a long long time</strong>! </p> <p><strong>Tracks that I’ve been tracking</strong> - Security, Fair Launch, Gaming, DeFi …</p><p>Hello <em>Kernel</em>! My name is Sid / <em>sidcode</em> / Siddhant. It means <strong>principles / tenets</strong>. That is probably why I was naturally drawn to the <a href="proxy.php?url=https://kernel.community/en/learn/">syllabus</a> ;)</p> <p><strong>Goals</strong> - listening as much as possible, vibing<sup id="sf-my-kernel-intro-sidcode-1-back"><a href="proxy.php?url=#sf-my-kernel-intro-sidcode-1" class="simple-footnote" title="And to be one of the best contributors in our memepool.">1</a></sup>, and building <strong>together, for a long long time</strong>! 
</p> <p><strong>Tracks that I’ve been tracking</strong> - Security, Fair Launch, Gaming, DeFi, Culture, and of course Learn.</p> <h2>Web3 journey so far:</h2> <ul> <li><strong>First foray</strong>: introduced to build on <a href="proxy.php?url=https://ipfs.io/">IPFS</a> in 2016 and stumbled upon Juan Benet <sup id="sf-my-kernel-intro-sidcode-2-back"><a href="proxy.php?url=#sf-my-kernel-intro-sidcode-2" class="simple-footnote" title=" Unrelated: I was invited to speak at the same symposium at Stanford 3 years later ">2</a></sup> (accidentally). Read the <a href="proxy.php?url=https://fermatslibrary.com/s/bitcoin">annotated Bitcoin paper</a> too but did not grok the social consequences fully until much later.</li> <li><strong>Academic</strong>: researching smart contracts, <a href="proxy.php?url=https://docs.google.com/presentation/d/1mYL72VPl5AY6VyzL1v70c9AHYeqhsRDGAYRgAYwTqR4/edit?usp=sharing">memes</a>, communities, and NFTs via <a href="proxy.php?url=https://www.initc3.org/">IC3, Cornell University</a>.</li> <li><strong>DAOs</strong>: GitcoinDAO, Token Engineering Commons, SuperTeam, Algovera, ScienceFund, Verses</li> <li><strong>GitcoinDAO</strong>: started in <a href="proxy.php?url=https://moonshotcollective.space/">Moonshot Collective</a>, helped on the <a href="proxy.php?url=https://newsletter.banklesshq.com/p/the-ultimate-dao-report">ultimate DAO workers report</a>, tech coordinator for <a href="proxy.php?url=https://m-m-m.xyz/">MMM</a> (memes, merch, marketing), a librarian at the <a href="proxy.php?url=https://gitcoin.co/blog/seeking-a-new-kind-of-public-good/">Public Goods</a> workstream. 
<em>Public Goods are good</em>.</li> <li><strong>Governauts</strong>: researching reward systems for the next wave of DAO coordination on a governauts scholarship with <a href="proxy.php?url=https://tokenengineeringcommunity.github.io/website/docs/academy-welcome/#dao-rewards-systems">Token Engineering Commons</a>.</li> <li><strong>Projects</strong>: <a href="proxy.php?url=https://gitcoin.co/grants/4151/silentauth-the-de-facto-multi-factor-authenticator">Silent Auth</a>, <a href="proxy.php?url=https://gitcoin.co/grants/4307/deschool">DeSchool</a>, contributor to <a href="proxy.php?url=https://memes.party">memes.party</a>, Declaration of <a href="proxy.php?url=https://www.interdependence.online/declaration">Interdependence of Cyberspace</a></li> <li><strong>Hackathons</strong>: won the first prize and popularity prize at the <a href="proxy.php?url=https://sbic2021.sbip.sg/">Singapore Blockchain Innovation Challenge</a> in December 2021 for cross-chain community authentication for DAOs, won the <a href="proxy.php?url=https://twitter.com/balajis">Balaji Srinivasan</a> prize at the <a href="proxy.php?url=https://buildingoutloud.solana.com/">Solana India hackathon</a> in September 2021, a <a href="proxy.php?url=https://devpost.com/software/charpie">special prize</a> at HackZurich 2021, and the overall prize at <a href="proxy.php?url=https://www.globaltechchallenge.com/">National Blockchain Challenge</a> in 2019 in Singapore. 
The real prize in all of these are my friends whom I got to spend those hours with, and the new friends I made subsequently!</li> </ul> <h2>Tech / (Computer) Science:</h2> <ul> <li><strong>Security</strong>: <a href="proxy.php?url=https://www.youtube.com/watch?v=JJ5xZvcEE1I&amp;t=1826s">cyber-physical security for national critical infrastructures</a> at the iTrust Centre for Research in Cyber Security, Singapore University of Technology and Design</li> <li><strong>Space</strong>: satellite image processing and remote sensing for the Indian Space Research Organization. My first foray into free and open source software.</li> <li><strong>Distributed Systems</strong>: and Engineering at Goldman Sachs</li> <li><strong>Robotics</strong>: Worked extensively with mobile robot teams and rovers via Google Summer of Code, Italian Mars Society and Python Software Foundation. <a href="proxy.php?url=https://www.cyber-physical.space/categories/gsoc/">Blog link</a></li> <li><strong>Extended Reality</strong>: designing explainable security and smart cities for <a href="proxy.php?url=https://www.lun.com/Pages/NewsDetail.aspx?dt=2018-11-16&amp;PaginaId=28&amp;bodyid=0">policymakers through VR/AR/MR</a>.</li> <li><strong>Papers and Patents</strong>: despite their pitfalls, centralized institutions have assembled our knowledge graph reasonably OK so far for <a href="proxy.php?url=https://scholar.google.co.in/citations?view_op=list_works&amp;hl=en&amp;hl=en&amp;user=u4_lYmUAAAAJ&amp;alert_preview_top_rm=2&amp;scilu=&amp;scisig=AMD79ooAAAAAYcuboWhjqQnNuAekCyEzQAh41NZgqjPs&amp;gmla=AJsN-F7zSFeadarueSUnSgjOWEfAVKXiomWos9f5D1MtTwXGNlTzmFLFD_T0egP6onwPcDxqU4q2dT4R5YZHs7SpA2_hz1yg7vC7ijKUIZBo9ad327pKksI&amp;sciund=15946461532337949835">getting one’s ideas out there</a></li> </ul> <h2>Economics, Policy:</h2> <ul> <li><strong>Cyber-physical security</strong>: actively involved in organizing global and local cyber-physical security exercises- like NATO’s <a 
href="proxy.php?url=https://ccdcoe.org/exercises/locked-shields/">Locked Shields</a> and <a href="proxy.php?url=https://itrust.sutd.edu.sg/ciss/ciss-2021-ol/">Critical Infrastructure Security Showdown</a>.</li> <li><strong>Startup life</strong>: Chief strategy/storytelling officer of <a href="proxy.php?url=https://silencelaboratories.com/">Silence Labs</a>. We’ve made authentication invisible while increasing security. <em>Coming soon to web3.</em></li> <li><strong>Pseudonymous work</strong>: editor of the sequel to a top Bitcoin and economics book that you have most likely read.</li> <li><strong>Banking</strong>: learned a lot about global Investment Management, Banking, and Markets at my first job as a strategic analyst at Goldman Sachs.</li> <li><strong>Consulting</strong>: advising policymakers and companies who’re building the future of society.</li> </ul> <h2>Community:</h2> <ul> <li><a href="proxy.php?url=https://www.complexityweekend.com/">Complexity Weekend</a> | Facilitator and eusocial participAnt.</li> <li><a href="proxy.php?url=https://t.me/joinchat/jTEtN-TDtLc2NDk1">DesignSG</a> | founder and co-admin for a community of 2050+ designers.</li> <li><a href="proxy.php?url=https://t.me/joinchat/8dfMVX-gYQlhMDU9">CryptoTechSG</a> | founder and co-admin for a local community of crypto practitioners. 
Fun fact: I made Vitalik <a href="proxy.php?url=https://twitter.com/sidcode_/status/1443202231293992960?s=20">give a shoutout</a> to the community in one of his AMAs.</li> <li><a href="proxy.php?url=https://better.sg">Better.sg</a> | volunteering as a project co-lead and builder in multiple projects <a href="proxy.php?url=https://projects.better.sg">listed here</a> on <a href="proxy.php?url=https://saylah.sg">non-verbal accessibility</a>, safety, and <a href="proxy.php?url=https://tobeyou.sg/">multicultural issues via gaming and interactive fiction</a>.</li> <li>Other Singapore-based groups: ProductSG, GatherSG, Creative Technologists </li> <li>India-based groups: Climbing (Bangalore Climbing Initiatives) and Skating (Bengaluru Skaters)</li> </ul> <h2>Culture:</h2> <ul> <li><strong>Poetry</strong>: expressing computing culture by translating classical Urdu/Hindi poetry to English at <a href="proxy.php?url=https://twitter.com/CodeShayari/">CodeShayari</a>. <a href="proxy.php?url=https://docs.google.com/document/d/1neMrjcQ2vjJAl7ZE02JlV6wELbxuirITpdQl_Ti1spk/edit#heading=h.zcwjgo9spjww">Selected works</a>.</li> <li><strong>Design</strong>: MIT Media Lab <a href="proxy.php?url=https://www.indiainitiative.mit.edu/">design innovation workshop</a>. 
I designed <a href="proxy.php?url=https://www.youtube.com/watch?v=K5bR0ndroVk">Smart Textiles</a>.</li> <li><strong>Storytelling</strong>: bringing stories to code and pull requests (pseudonymous stealth project, to be released soon)</li> <li><strong>Humanities</strong>: involved in multiple communities studying forgotten/current/emerging cultures (around the world in cyber-physical spaces)</li> </ul> <h2>Personal:</h2> <ul> <li><strong>Physical Culture</strong>: amateur bodybuilding, strength training, calisthenics, martial arts (Kalaripayattu), and all ten foundational <a href="proxy.php?url=http://journal.crossfit.com/2002/04/foundations.tpl">domains of fitness</a> </li> <li><strong>Longevity</strong>: training/nutrition/sleep geek before it was cool</li> <li><strong>Food</strong>: Love everything and anything about the form and function of food of all kinds</li> <li><strong>Generalist</strong>: very putty-like | interested in many things</li> <li><strong>Sports</strong>: climbing, skating, football, frisbee, dragon boating, swimming</li> <li><strong>Dance</strong>: enthusiast (about music in general)</li> <li><strong>Filmmaking</strong> and combining hobbies, like <a href="proxy.php?url=https://www.youtube.com/watch?v=gGw6QyDipkI">this one</a></li> </ul> <h2>Upbringing:</h2> <ul> <li>90s kid</li> <li>a torchbearer of <a href="proxy.php?url=https://twitter.com/sidcode_/status/1448585694041960448?s=20">the relay generation</a>, the last one to be proficient with both analog and digital worlds</li> <li>mixed culture kid (genotypically and phenotypically)</li> <li>moved around quite a bit in different geographies across continents - coast, desert, valley, mountain, wetland, island</li> <li>used to sing as a kid in the Christmas choir and Indian Classical Music. Then life happened. 
Now it is a plan for later.</li> </ul> <h2>Socials:</h2> <ul> <li><a href="proxy.php?url=https://twitter.com/sidcode_">@sidcode on Twitter</a> </li> <li><a href="proxy.php?url=https://linkedin.com/in/sidcode">@sidcode on LinkedIn</a></li> <li>sidcode#1729 on Discord</li> </ul> <p>I’m gradually giving up on all social networks before they give themselves up. Once bitten, twice shy. Looking at you, G+ </p> <p>Thanks for reading! If you’ve read this far, ping me and I’ll send you a POAP ;) <em>Fin</em></p><ol class="simple-footnotes"><li id="sf-my-kernel-intro-sidcode-1">And to be one of the best contributors in our <em>memepool</em>. <a href="proxy.php?url=#sf-my-kernel-intro-sidcode-1-back" class="simple-footnote-back">↩</a></li><li id="sf-my-kernel-intro-sidcode-2"> Unrelated: I was <a href="proxy.php?url=https://www.youtube.com/watch?v=omR7R4wvK8g">invited to speak</a> at the same symposium at Stanford 3 years later <a href="proxy.php?url=#sf-my-kernel-intro-sidcode-2-back" class="simple-footnote-back">↩</a></li></ol>Back to the future2021-02-12T12:12:00+08:002021-02-12T12:12:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2021-02-12:/letters/back-to-the-future/<section> <p>This letter checks if time travel works as desired. Guess what, it does. Let the trolling and trawling through the threads of time begin-</p> <blockquote> <p>Truly thankful to this terrific <em>tenet</em><sup id="sf-back-to-the-future-1-back"><a href="proxy.php?url=#sf-back-to-the-future-1" class="simple-footnote" title="My name means tenet, which also happens to be a 2020 Christopher Nolan movie about principles and time-travel">1</a></sup> technology that’s turbocharging truthful tinkerers through these thrilling time travel tropes. 
Tenaciously tried; thoroughly tested - these turnstile<sup id="sf-back-to-the-future-2-back"><a href="proxy.php?url=#sf-back-to-the-future-2" class="simple-footnote" title="Red and Blue turnstiles are used in Tenet to travel back and forth the linear timeline ">2 …</a></sup></p></blockquote></section><section> <p>This letter checks if time travel works as desired. Guess what, it does. Let the trolling and trawling through the threads of time begin-</p> <blockquote> <p>Truly thankful to this terrific <em>tenet</em><sup id="sf-back-to-the-future-1-back"><a href="proxy.php?url=#sf-back-to-the-future-1" class="simple-footnote" title="My name means tenet, which also happens to be a 2020 Christopher Nolan movie about principles and time-travel">1</a></sup> technology that’s turbocharging truthful tinkerers through these thrilling time travel tropes. Tenaciously tried; thoroughly tested - these turnstile<sup id="sf-back-to-the-future-2-back"><a href="proxy.php?url=#sf-back-to-the-future-2" class="simple-footnote" title="Red and Blue turnstiles are used in Tenet to travel back and forth the linear timeline ">2</a></sup> toys. Tata! :)</p> </blockquote> <p>Context for people reading this after the post date- This post was conceived and posted sometime in January, 2021. Marvel at the choice of the post date.<label for="sn-brain" class="margin-toggle">†</label><input type="checkbox" id="sn-brain" class="margin-toggle"><span class="marginnote">P.S. - palindrome</span> and thanks for reading! 
Happy Niu Year<sup id="sf-back-to-the-future-3-back"><a href="proxy.php?url=#sf-back-to-the-future-3" class="simple-footnote" title=" CNY 2021 Ox ">3</a></sup> to all :)</p> </section><ol class="simple-footnotes"><li id="sf-back-to-the-future-1">My name means <em>tenet</em>, which also happens to be a <a href="proxy.php?url=https://www.imdb.com/title/tt06723592/">2020 Christopher Nolan movie</a> about principles and time-travel <a href="proxy.php?url=#sf-back-to-the-future-1-back" class="simple-footnote-back">↩</a></li><li id="sf-back-to-the-future-2">Red and Blue turnstiles are used in <em>Tenet</em> to travel back and forth the linear timeline <a href="proxy.php?url=#sf-back-to-the-future-2-back" class="simple-footnote-back">↩</a></li><li id="sf-back-to-the-future-3"> CNY 2021 Ox <a href="proxy.php?url=#sf-back-to-the-future-3-back" class="simple-footnote-back">↩</a></li></ol>Jeet Kune Do!2021-02-01T02:04:00+08:002021-02-01T02:04:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2021-02-01:/letters/jeet-kune-do/<section> <p>Even though this letter is just a draft right now, I am publishing it anyway. It’ll stay on the front page eventually forcing me to edit it so it’s not a grot letter anymore.</p> <p>This letter is dedicated to the logo that you see on the site. It …</p></section><section> <p>Even though this letter is just a draft right now, I am publishing it anyway. It’ll stay on the front page eventually forcing me to edit it so it’s not a grot letter anymore.</p> <p>This letter is dedicated to the logo that you see on the site. It’s a member of the <em>Taijitu</em> family <sup id="sf-jeet-kune-do-1-back"><a href="proxy.php?url=#sf-jeet-kune-do-1" class="simple-footnote" title="the Yin Yang symbol being one of the popular ones">1</a></sup>.</p> <p>JKD is Bruce Lee’s contribution to Martial Arts. Capturing the best-of-“insert your favourite martial art”, he created <em>Jeet Kune Do</em> - a pinnacle of duelism (or even dualism). 
</p> <h2>Here’s a Computer Science or record-keeping analogy to the philosophy/principle/tenet/सिद्धांत that’s core to Jeet Kune Do-</h2> <ul> <li>Create what is novel</li> <li>Retain what works</li> <li>Update what exists</li> <li>Delete what does not work</li> </ul> <p>SIDTODO: Ip Man, Kalaripayattu, Taijitu, duality, etc.</p> </section><ol class="simple-footnotes"><li id="sf-jeet-kune-do-1">the Yin Yang symbol being one of the popular ones <a href="proxy.php?url=#sf-jeet-kune-do-1-back" class="simple-footnote-back">↩</a></li></ol>Winning the National Blockchain Challenge2019-06-22T20:50:00+08:002019-06-22T20:50:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2019-06-22:/letters/winning-the-national-blockchain-challenge/<h1>Context</h1> <p>Aung and I responded to the following interview soon after the National Blockchain Challenge 2019. This was a pivotal moment for me to understand the importance of this emerging technology in an emerging smart nation such as Singapore.</p> <h2>BlockBuster - Siddhant Shrivastava, Aung Maw, Sanskar Shrivastava, and Tarun Kumar Vangani …</h2><h1>Context</h1> <p>Aung and I responded to the following interview soon after the National Blockchain Challenge 2019. 
This was a pivotal moment for me to understand the importance of this emerging technology in an emerging smart nation such as Singapore.</p> <h2>BlockBuster - Siddhant Shrivastava, Aung Maw, Sanskar Shrivastava, and Tarun Kumar Vangani | Post-NBC&rsquo;19 Interviews</h2> <p><strong>Sid</strong>dhant Shrivastava, Aung Maw, Sanskar Shrivastava, and Tarun Kumar Vangani participated in the <a href="https://nbc.devpost.com/">NBC&rsquo;19 Hackathon</a> recently, and we caught up with them after the event to ask how it went.</p> <p>Their <a href="https://drive.google.com/drive/folders/19YGIej8HdK9QfdwqksnJ2Y50qa_KGl0o">presentation is available on Google Drive</a>.</p> <p>Sid and Aung presented the technical and business pitches on the day of the Finale. Their project, aptly named BlockBuster, bagged the first prize overall and the Vertical Prize in the Cybersecurity category of the inaugural <a href="https://www.globaltechchallenge.com/">event</a>.</p> <p>Enjoy the interview!</p> <hr> <p><strong>Brendan</strong>: Hi!</p> <p><strong>Sid and Aung</strong>: Heya Brendan and our dear readers! Thanks for helping make BlockBuster the blockbuster project in the inaugural National Blockchain Challenge 2019 😊 </p> <p><strong>Brendan</strong>: Tell us a bit about your project&hellip;</p> <p><strong>Sid and Aung</strong>: Before we do that, we must tell you that we took the &lsquo;you are allowed to pivot your idea at any time&rsquo; rule too seriously and ended up changing our project idea thrice (right up until the evening of Day 2)! 
Even the name of the project went through two major iterations (formerly called BlockBox - which was an unimaginative play on the black box nature of blockchain and &ldquo;The Box&rdquo; from the HBO show &lsquo;Silicon Valley&rsquo;) 😂</p> <p>But it all ended up working out for the best, as our final idea of BlockBuster closely matched our original idea of securing decentralized systems, such as Internet of Things devices, using a decentralized technology such as the Blockchain.</p> <p>Here&rsquo;s the elevator pitch - &ldquo;BlockBuster detects the tampered blocks of data in any storage system and recovers the original data for you, even in the case of cyber-attacks.&rdquo;</p> <p>BlockBuster is a cyber-physical security solution for securing time-series databases through the principles of Blockchain and Distributed Systems. The prototype for the hackathon focuses on tamper-proofing the centralized database of a sample Power/Energy System with six stages. It is common for Critical Infrastructures of a city to have these centralized databases, which are prone to cyberattacks and modifications. This makes forensic investigation of an attack quite intractable. BlockBuster solves this issue of data-tampering of the centralized &lsquo;historian&rsquo; database by enforcing replication and immutability of data (and hence &lsquo;busts&rsquo; the harmful effects of attacks on the valuable plant operation data). BlockBuster replicates data across storage nodes to verify and recover the data in case the database is tampered with after a cyber-attack takes place. BlockBuster also works with existing attack detection techniques and enhances the integrity and availability of security devices themselves so they can be used for useful forensics and reconnaissance. Future plans include adding a service like Torus to establish blockchain-based access control, thereby bringing confidentiality to BlockBuster. 
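The verify-and-recover idea described here can be sketched in a few lines of Python (an illustrative sketch only, not the actual BlockBuster code - the record format and function names are made up for the example):

```python
import hashlib
from collections import Counter

def chain(records):
    """Build a hash chain: each block commits to its record and the previous hash."""
    prev, blocks = "genesis", []
    for rec in records:
        digest = hashlib.sha256((prev + rec).encode()).hexdigest()
        blocks.append({"rec": rec, "prev": prev, "hash": digest})
        prev = digest
    return blocks

def tampered_indices(blocks):
    """Return indices whose stored hash no longer matches the recomputed one."""
    bad, prev = [], "genesis"
    for i, block in enumerate(blocks):
        digest = hashlib.sha256((prev + block["rec"]).encode()).hexdigest()
        if digest != block["hash"] or block["prev"] != prev:
            bad.append(i)
        prev = block["hash"]  # follow the stored chain to isolate each bad block
    return bad

def recover(replicas):
    """Heal a tampered chain by majority vote across replicated copies."""
    healed = []
    for i in range(len(replicas[0])):
        votes = Counter(replica[i]["rec"] for replica in replicas)
        healed.append(votes.most_common(1)[0][0])
    return healed
```

A tampered historian record then shows up as a hash mismatch at its index, and as long as a majority of the storage nodes are intact, the original time series can be voted back into place. 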
For the prototype, we use a private Ethereum account, but the design of the system is generic enough to scale across a full-fledged operational plant, smart home, hospital, or a city using permissioned or public Ethereum or other blockchain networks as well.</p> <p><strong>Brendan</strong>: What I found interesting about your project was your focus on leveraging the properties of a blockchain as an immutable ledger, as well as having multiple redundant copies of the data, to allow a system to recover when some of the nodes have had their data corrupted by bad actors. Could you elaborate more on that?</p> <p><strong>Sid and Aung</strong>: Sure thing, Blockchain For The Win! Hackathons are a show-and-tell, so we&rsquo;d rather show it to you. Watch the demo <a href="https://youtu.be/3YolkE1U3gU?t=11965">here!</a></p> <p><strong>Brendan</strong>: Was that the idea that you originally started with, or did you change things along the way?</p> <p><strong>Sid and Aung</strong>: We pivoted our idea thrice during the hackathon - Smart Homes, Access Control, and then ended up on Critical Infrastructures. It is important to describe our workplace here since it inspired all of these ideas -</p> <p>We are both researchers in Prof. Aditya Mathur’s group at the iTrust Centre for Research in Cyber Security, Singapore University of Technology and Design. Our daily work focuses mainly on security in the design phase of Critical Cyber-Physical Systems: devising techniques to generate attacks, predict attacks, defend against these attacks, and playing around with explorable security labs in Mixed/Virtual Reality. The lab has scaled-down versions of a city&rsquo;s water treatment plant, water distribution plant, a power grid, and an Internet of Things Testbed. The interconnected nature of these testbeds helps us study realistic scenarios of cyber-physical attacks on a city like Singapore. 
</p> <p>In fact, one of the judges appreciated how a project like BlockBuster could have been used to assist in the attack detection and forensic investigation during the SingHealth cyber attack in 2018.</p> <p><strong>Brendan</strong>: Do you intend to continue working on the project, now that the hackathon is over?</p> <p><strong>Sid and Aung</strong>: Oh yes, indeed. In fact, Dr. Rex from BANSEA motivated us to ship the project in some form as soon as possible and we are working on it as we speak.</p> <p>The big idea that we see here is that BlockBuster can be used for securing security devices themselves, thereby making the &ldquo;Who Watches the Watchmen?&rdquo; problem easier to solve. Its decentralized nature makes it extremely computationally expensive to attack.</p> <p><strong>Brendan</strong>: Was there anything new that you learnt during the hackathon?</p> <p><strong>Sid, Aung, Sanskar, Tarun</strong>: That&rsquo;s a rhetorical question! All of us had different motivations during the event - but were united by the love for learning (after all we were at the <strong>Lifelong Learning Institute</strong>!).</p> <p>Ever since he saw the flyers for the event in the SUTD elevators, Sid wanted to learn more about Blockchain and relate it to cybersecurity (coming from a background in Computer Science, Design, and Virtual Reality). It was Aung&rsquo;s and Sanskar&rsquo;s first hackathon ever (and they ended up winning it!). Sanskar had just finished high school at the time of the hackathon and wanted to learn about how hackathons work and also got to dabble in vector graphics design. Tarun is exceptionally talented at machine learning and Python and wanted to branch out into this new promising technology through this event.</p> <p>But technical skills aside, we learned a lot about hustling, public speaking, sharing responsibility and ownership. 
In the spirit of Blockchain, we were a very efficient distributed and decentralized system ourselves throughout the event!</p> <p><strong>Brendan</strong>: Did you attend any of the workshop events that were held in the lead up to the hackathon?</p> <p><strong>Sid and Aung</strong>: Yes, we interacted quite a bit with the mentors, judges, and organizers all throughout the event as well. It was heartening to see the CEO of a company make changes to their product&rsquo;s API (in production worldwide) on the spot so we could use it conveniently.</p> <p>We are fortunate that the events that we could not attend were recorded and livestreamed so our other teammates could get up to speed. Kudos to Engineers.SG!</p> <p><strong>Brendan</strong>: What did you think about the hackathon in general?</p> <p><strong>Sid</strong>: Exceptionally well-organized for its first season. The organizers pulled it off as veterans in the field of organizing hackathons. The Slack channel was a treat to watch with an extremely high signal to noise ratio which is pretty rare for such events. A huge shoutout to Wing, Brendan, Daniel, Lewis Wong, who went out of their way to make our project better and understand our schedule.</p> <p>Besides that, the venue (<strong>Lifelong Learning Institute</strong>) was quite hassle-free and spacious to work at. All the logistics issues were well-taken care of. The mentors and judges were quite friendly and motivating. The mini-hackathon in the form of the gamified checklist did push us to learn some Business concepts (things we do for SWAG!)</p> <p><strong>Brendan</strong>: What was your favourite part about the hackathon?</p> <p><strong>Sid and Aung</strong>: The public speaking opportunity with the luminaries in the audience, hands down! There were not only CEOs and CTOs, but also people who get to decide a lot of policies around the region. 
To convince them that our project is worthwhile and get their positive feedback is an immense confidence boost for us to proceed further on this journey to help Singapore become the Cybersecurity powerhouse that it aims to become through the Digital Defense initiative. It is heartening to see a bubbling blockchain brotherhood of sorts in our little Red Dot with frequent meetups and startups in this space. Events like this, where a six year old kid can present a business pitch to such an audience, are exactly what we need to make Singapore greater in the tech scene (that was also our favourite part, by the way!).</p> <p><strong>Brendan</strong>: If we were to hold the hackathon again, what did you think could be improved?</p> <p><strong>Sid and Aung</strong>: Not much really, this is one of the better-organized hackathons that we&rsquo;ve been to. We especially liked the rewards design of the hackathon where the top three prizes have the same prize money (our position among the winners notwithstanding 🙃). We&rsquo;d recommend this and other events organized by the Global Tech Challenge to anyone (and even try participating in them)!</p> <p><strong>Brendan</strong>: Thank you </p> <p><strong>Sid and Aung</strong>: You&rsquo;re the best Brendan! Thanks for the interview and this unique opportunity :+1: </p> <hr> <p>We were quite excited to see all of the cool projects that were built during the event, and hope that you continue building them.</p> <p>For many participants, this event was not just about the competition, but also about the learning.</p>First Aid in the wild(erness)2017-02-02T15:59:00+08:002017-02-02T15:59:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2017-02-02:/letters/first-aid-in-the-wilderness/<section> <p>This morning I attended a wilderness first aid workshop conducted by Muggy/Bhaskar - the guy who first showed me the ropes of roped climbing. 
SIDENOTE- Muggy was trained in both Wilderness First Aid and Wilderness First Response and chose to conduct this workshop pro-bono. Another climber provided his amazing office …</p></section><section> <p>This morning I attended a wilderness first aid workshop conducted by Muggy/Bhaskar - the guy who first showed me the ropes of roped climbing. SIDENOTE- Muggy was trained in both Wilderness First Aid and Wilderness First Response and chose to conduct this workshop pro-bono. Another climber provided his amazing office space on a Sunday. This was my first time seeing what a software consultant&rsquo;s life actually looks like.</p> <p>SIDTODO - share more about the day, the ambience, and the amazing bunch of people. The day started with some amazing <code>thatte idly</code> and <code>bisibelabath</code> - Bengaluru&rsquo;s pride.</p> <p>The session covered the different kinds of injuries- scratches, strains, sprains, fractures, bites, and every random scenario one will probably never face out in the wild except when they do. Muggy made an extensive powerpoint presentation pro-bono - this gesture to help and create caring communities continues to drive me in my effective altruistic ways circa 2020s.</p> <p>I learned how to apply bandages, create tourniquets, make makeshift stretchers out of ropes. We even did mock scenarios on one of our climbing buddies. It is recommended that weekend warrior wilderness folk (SIDENOTE- apparently that&rsquo;s what semi-adventurous citydwellers are called these days) practice these scenarios very frequently to keep up with the minutiae of these chops. The last thing we want is to make a snake bite worse by restricting bloodflow in the wrong area or drop a fractured friend due to wrong ropework while carrying them back from the crag. 
</p> <p>SIDTODO- Look up the notes I made for the day and share them with the world.</p> <p>Note from the future, circa late 2017- I got to apply the technique I learned to immobilize a possible hairline fracture on one of my goalkeeper friend&rsquo;s fingers, by using some steri-tape, when the fast and furious football fried his fingers rather than the palm. It was great to conduct a diagnostic test for fractures - knocking gently at the neighbouring joints. It was appalling to see how unaware most of us are until we indulge in these esoteric activities of our ancestors who survived in the wilderness. One friend suggested applying a muscle relaxant cream/spray on the finger. Fortunately, we ensured this well-meaning-but-utterly-disastrous advice remained just that - a conversation piece for nostalgia. </p> <p>Also another note from 2021- Blood Flow Restriction is not just semi-helpful in snake bites. It is also an effective driver of strength and hypertrophy at relatively low training intensities. This is backed by increasing amounts of evidence. I&rsquo;ll try applying this for some of my upper body training as I recuperate after a surgery.</p> <p><strong>Takeaway</strong>- always carry a well-stocked <code>wilderness</code> first aid kit (SIDENOTE- here&rsquo;s how it differs from a superficial first-aid kit); it saves lives!</p> </section>Telerobotics - Final Report2015-08-22T19:53:52+00:002015-08-22T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-08-22:/letters/telerobotics-final-report/<p>Hi all! Yesterday was the firm-pencils-down deadline for the Coding Period and the past week was one of the best weeks of the <em>Google Summer of Code 2015</em> program. I went all-guns-blazing with the <strong>documentation</strong> and <em>Virtual Machine distribution efforts</em> of my work on Telerobotics. I also added some significant …</p><p>Hi all! 
Yesterday was the firm-pencils-down deadline for the Coding Period and the past week was one of the best weeks of the <em>Google Summer of Code 2015</em> program. I went all-guns-blazing with the <strong>documentation</strong> and <em>Virtual Machine distribution efforts</em> of my work on Telerobotics. I also added some significant features to Telerobotics such as <em>ROS Integration with the EUROPA Scheduler</em>, which <a href="https://shrigsoc.blogspot.in/2015/08/finals.html">Shridhar</a> worked on this summer with the Italian Mars Society.</p> <h1>Project Report</h1> <p>I completed the main aspects of the Telerobotics interface with strong results -</p> <ul> <li>Introduced Robot Operating System (ROS) to ERAS</li> <li>Developed a Telerobotics Interface to Bodytracking and EUROPA</li> <li>Implemented Stereoscopic Streaming of 3-D video to the Blender Game Engine V-ERAS application</li> </ul> <p>I explain each of these points and summarize my experience in the following paragraphs. In the last week, I got a chance to pursue a collective effort in all the areas of my project -</p> <h2>Replication Experiments</h2> <p>The final week began with attempts to ensure that my mentors could replicate my machine setup in order to test and comment on the performance of Telerobotics. To that end, I added detailed instructions to describe my machine and network configuration, which can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/machine-configurations.rst?at=default">found here</a>.</p> <h2>Docker Working!</h2> <p>I delineated the importance of Docker in this project in a <a href="https://sidcode.github.io/blog/2015/06/12/all-for-docker-docker-for-all/">previous post</a>. 
Franco started the ball rolling by telling me how the <a href="https://www.itsprite.com/openstack-docker-for-gui-based-environments/">ssh-to-image</a> method could be used for running Qt applications in Docker. ROS and Gazebo employ Qt extensively for their visualization and simulation applications, so running Qt applications was a non-functional requirement of Telerobotics. With that method, the long-standing Docker issue was solved. The final Docker image with everything packaged can be used to test Telerobotics. The image can be pulled <a href="https://hub.docker.com/r/sidcode/ros-eras/">from here</a>. The instructions to use the image are in the <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/docker-instructions.rst?at=default">Telerobotics Documentation pages</a>.</p> <p>A walkthrough with the Docker image can be found in this YouTube video that I created -</p> <p><span class="videobox"> <iframe width="640" height="390" src='https://www.youtube.com/embed/cyGshc9RLoQ' frameborder='0' webkitAllowFullScreen mozallowfullscreen allowFullScreen> </iframe> </span></p> <h2>Fallback Keyboard Teleoperation</h2> <p>Telerobotics works out of the box with the Bodytracking module that Vito has developed. But in the unfortunate case that the Tango-Controls server fails, there is a functional requirement for a <strong>fallback interface</strong>. Seeking inspiration from the teleoperation tools for ROS, I added the Fallback Keyboard Teleoperation interface. Thus, the Rover can now also be controlled with the keyboard if need be. The controls are currently inclined towards right-handed astronauts. I hope to add a left-handed version soon as a minor extension of the interface. 
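The heart of such a fallback teleoperation interface is just a key-to-velocity mapping; here is a minimal sketch (the bindings, velocity limits, and stop key are assumptions for illustration, not the actual ERAS code - the real node wraps something like this in a rospy publisher of geometry_msgs/Twist messages):

```python
# key -> (linear x, angular z) increments; a right-hand-friendly layout
KEY_BINDINGS = {
    "i": (0.5, 0.0),   # forward
    "k": (-0.5, 0.0),  # backward
    "j": (0.0, 0.5),   # turn left
    "l": (0.0, -0.5),  # turn right
    " ": (0.0, 0.0),   # spacebar: emergency stop
}

MAX_LINEAR, MAX_ANGULAR = 1.0, 1.5  # assumed rover-safe velocity limits

def clamp(value, limit):
    """Keep a velocity inside [-limit, +limit]."""
    return max(-limit, min(limit, value))

def next_command(key, linear, angular):
    """Apply one keypress to the current (linear, angular) velocity command."""
    if key not in KEY_BINDINGS:
        return linear, angular  # ignore unknown keys
    d_lin, d_ang = KEY_BINDINGS[key]
    if (d_lin, d_ang) == (0.0, 0.0):
        return 0.0, 0.0  # emergency stop zeroes both components
    return clamp(linear + d_lin, MAX_LINEAR), clamp(angular + d_ang, MAX_ANGULAR)
```

Clamping at the mapping level is what keeps a mashed key from ever commanding a velocity the rover cannot execute. 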
The code for this can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/src/teleoperation-keyboard.py?at=default">found here</a>.</p> <h2>EUROPA and Navigation Interfaces</h2> <p>Shridhar&rsquo;s work on the <a href="https://code.google.com/p/europa-pso/">EUROPA platform</a> needed access to the Telerobotics interface for the following tasks -</p> <ul> <li>Getting Robot Diagnostic Information</li> <li>Navigating the Robot to certain points</li> </ul> <p>I achieved the initial goal before midsems. The second goal was achieved this week after the EUROPA Planner was complete. The workflow to this end was to receive coordinates from the EUROPA Tango Server and send them to the ROS Node corresponding to the Husky.</p> <p>Navigating between two points on an incompletely-known map first requires the rover to know where it is; this localization problem is handled by <a href="https://wiki.ros.org/amcl">Adaptive Monte Carlo Localization (AMCL)</a>, on top of which the ROS navigation stack plans the path.</p> <p>It is necessary to localize the rover with respect to its environment based on the inputs of its multiple sensors. The following diagram from the ROS website explains the concept - <img alt="ROS Localization" src="https://sidcode.github.io/images/articles/2015/ros_localization.png"></p> <p>I used the Husky frame coordinates and added the code using the <a href="https://wiki.ros.org/actionlib">ROS Action Server and Action Client</a> and Tango Event Listeners to create the appropriate <strong>Telerobotics-EUROPA interfaces</strong>. It can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/src/europa-navigation.py?at=default">found here</a>.</p> <h2>Minoru Camera Tools</h2> <p>The Minoru 3-D Camera that I used to prototype <a href="proxy.php?url=">streaming applications</a> for ERAS has obscure documentation for Linux platforms. 
I was able to set up the Minoru Calibration tools from a <a href="https://github.com/bashrc/libv4l2cam">Git clone</a> of the <a href="https://code.google.com/p/sentience/wiki/MinoruWebcam">original</a> <code>v4l2stereo</code> package. I added them to the <code>streams</code> tree of the Telerobotics source code. It can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/streams/Minoru3D/v4l2stereo-calibrate-minoru/?at=default">accessed here</a>.</p> <h2>Documentation!</h2> <p>2023 Update - The documentation has been moved to: https://marscity.readthedocs.io/en/latest/servers/telerobotics/doc/index.html</p> <p>The following paragraph is being kept in for posterity&rsquo;s sake.</p> <p>The documentation underwent a major overhaul this week. In addition to <strong>commenting the code</strong> since the beginning, I made sure to update/add the following documentation pages -</p> <ul> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/sad.rst?at=default">Software Architecture Document for Telerobotics</a></li> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/docker-instructions.rst?at=default">Docker Image Setup Instructions</a></li> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/telerobotics-guide.rst?at=default">Telerobotics Walkthrough</a></li> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/setup-minoru.rst?at=default">Minoru Camera Calibration and Instructions</a></li> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/erasvr/doc/setup.rst?at=default">Oculus Rift Troubleshooting and Installation</a></li> <li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/setup-ffmpeg.rst?at=default">Video Streaming FFmpeg Manual</a></li> </ul> <p>The latest version of the documentation can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/?at=default">found here</a>.</p> <p>The excitement of the final moments can be ascertained from my commit patterns on the last day -</p> <p><img alt="Archive Tagging" src="https://sidcode.github.io/images/articles/2015/gsoc_ends.png"></p> <blockquote> <p>Learning Experience</p> </blockquote> <p>The past 12 weeks (and an almost equivalent time before that during the application period) have been transformative.</p> <p>Just to get an idea of the different tools and concepts that I&rsquo;ve been exposed to, here&rsquo;s a list -</p> <ul> <li>Tango Controls</li> <li>Robot Operating System</li> <li>Blender Game Engine</li> <li>Oculus Rift</li> <li>FFmpeg</li> <li>Stereoscopic Cameras</li> <li>Video4Linux2</li> <li>Python</li> <li>OpenVPN</li> <li>Docker</li> </ul> <p>That indicates a great deal of experience in terms of tools alone.</p> <blockquote> <p>I learned how to create software architecture documents, how to work in tandem with other developers, how to communicate in the Open Source Community, when to seek help, how to seek help, how to help others, how to document my work, how to blog, and much more.</p> </blockquote> <p>With so many things to say, here&rsquo;s what I must definitely acknowledge -</p> <blockquote> <p>Thank you Python Software Foundation, Italian Mars Society, and Google Open Source Programs Office for this opportunity!</p> 
</blockquote> <p>I seriously can&rsquo;t imagine a better way in which I could have spent the past summer. I got a chance to pursue what I wanted to do, got an amazing mentoring and umbrella organization, a fascinating group of peers to work with, and arguably the best launchpad for Open Source contributions - the Google Summer of Code.</p> <p>Time for evaluations now! Fingers crossed :-)</p> <p>I have maintained a weekly-updated blog since the beginning of this summer of code. My organization required the blog frequency to be one post every two weeks. I loved blogging about my progress throughout. The eighteen posts so far can be found in the <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">GSoC Category of my website</a>. In case you are interested in this project with the Italian Mars Society, you can follow the <a href="proxy.php?url=https://sidcode.github.io/tag/ims.html">page of my blog</a></p> <p>Ciao!</p>Telerobotics - The Penultimate Crescendo2015-08-14T19:53:52+00:002015-08-14T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-08-14:/letters/telerobotics-the-penultimate-crescendo/<p>Hi! As the hard-deadline date for the <em>Google Summer of Code</em> program draws to a close, I can feel the palpable tension that is shared by my mentors and fellow students at the <strong>Italian Mars Society</strong> and the <strong>Python Software Foundation</strong>.</p> <h1>All-Hands Meeting</h1> <p>We at the <em>Italian Mars Society</em> had …</p><p>Hi! As the hard-deadline date for the <em>Google Summer of Code</em> program draws to a close, I can feel the palpable tension that is shared by my mentors and fellow students at the <strong>Italian Mars Society</strong> and the <strong>Python Software Foundation</strong>.</p> <h1>All-Hands Meeting</h1> <p>We at the <em>Italian Mars Society</em> had the third all-hands meeting last evening (13th August). The almost two-hour Skype Conference call discussed a gamut of topics in-depth. 
Some of these were-</p> <h2><strong>Software Testing</strong> guidelines</h2> <p>Ezio described the various ways of <strong>Unit Testing</strong> in different applications like rover movements, bodytracking, etc. In my case, I had been checking for setup prerequisites and establishing the serializability of the ROS system before other modules could start up. That way, it is ensured that all the required distributed systems are up and running before they are used. <strong>Integration Testing</strong> is crucial in the ERAS application, where things like Telerobotics, Bodytracking, and the EUROPA Planner all blend together seamlessly. I&rsquo;ve integrated Telerobotics and Bodytracking, which can be observed in <a href="https://bitbucket.org/italianmarssociety/eras/commits/e87a0c1bfb46e0b8ba4b684b51060f8527aa1d6b">this commit</a>.</p> <h2>Telerobotics</h2> <p>Telerobotics in its current state is more precise than ever. This video demonstrates this fact -</p> <p><span class="videobox"> <iframe width="640" height="390" src='https://www.youtube.com/embed/94vfIr1cu7k' frameborder='0' webkitAllowFullScreen mozallowfullscreen allowFullScreen> </iframe> </span></p> <p>The YouTube link for the video is <a href="https://www.youtube.com/watch?v=94vfIr1cu7k">this</a>.</p> <p>I improved upon the previous integration with Bodytracking and handled the possible exceptions that may occur. The results have been stunning. I used the updated version of <a href="https://vigentile.wordpress.com/2015/07/31/enhancement-of-kinect-integration-in-v-eras-fifth-report/">Vito&rsquo;s bodytracker</a>, which can detect closed hands. Since the sensor refresh-rate has been reduced to 30 times per second, the Telerobotics module has much smoother movements. 
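Capping a sensor stream at a fixed refresh rate like this amounts to a simple throttle; a sketch of the idea (illustrative only, not the actual Bodytracking code):

```python
class Throttle:
    """Let at most `rate_hz` updates per second through, dropping the rest."""

    def __init__(self, rate_hz):
        self.min_interval = 1.0 / rate_hz
        self.last = None  # timestamp of the last accepted update

    def accept(self, timestamp):
        """Return True if an update arriving at `timestamp` (seconds) should pass."""
        if self.last is None or timestamp - self.last >= self.min_interval:
            self.last = timestamp
            return True
        return False
```

With `Throttle(30.0)`, two tracker updates arriving 10 ms apart collapse into one command, which is the smoothing effect the reduced refresh rate has on the rover's motion. 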
Here is a snapshot of the Bodytracking application running in a Windows Virtual Machine -</p> <p><img alt="Visual Tracker" src="https://sidcode.github.io/images/articles/2015/visual-3.png"></p> <h2>EUROPA Planner and Navigation Integration</h2> <p>Shridhar has been working on the Planner, which outputs Cartesian coordinates in the format <code>(x,y)</code> to which the rover must navigate. I am using the <code>AMCL</code> navigation algorithm for known maps in addition to the <code>actionlib</code> server of ROS to facilitate this integration. The challenge here is to reconcile the Cartesian coordinates of <code>EUROPA</code> with those of <code>ROS</code>. This should hopefully be complete in the next couple of days.</p> <h2>AMADEE15 mission</h2> <p>Yuval described how the recently concluded mission was a huge success, focusing on the following frontiers-</p> <ul> <li>GPS integration with Blender</li> <li>Photogrammetry to reproduce Blender scenes for Virtual EVAs</li> <li>Unity3D and Oculus Integration</li> <li>AoudaX realtime software</li> <li>Generic ERAS Data Logger</li> <li>Husky navigation</li> </ul> <p>Franco briefly explained the <em>Neuro-vestibular</em> and <em>Husky</em> Scientific experiments.</p> <h2>Other things</h2> <p>Final efforts with Docker - After a lot of success, I have just one gripe with <strong>Docker</strong>. Running the Gazebo simulator, <code>rviz</code> (ROS visualizer), and the Telerobotics module requires THREE terminals. Working with ROS as a master inherently requires access to a lot of terminals for logging, echoing topic messages, starting programs, etc. The current ways to achieve multiple terminals and Qt applications in Docker are at best makeshift workarounds. To handle a graphics-heavy application like Telerobotics, we require a Graphical Environment. Docker is great for providing a common service framework but not so good at graphical applications like ROS. 
That&rsquo;s why I have been unable to get Docker working with the graphical aspects of ROS.</p> <h1>Documentation</h1> <p>In the final leg of the program, it is vital to go all-guns-blazing on documenting the software work that the students do. This ensures future development, maintainability, and clarity of thought. I recently added instructions in the Documentation directory - <code>telerobotics/doc/</code> - to <strong>replicate my setup</strong>. This can be found in my <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/commits/dcd59b09d09f2e782e8c596f9f76b529eafd2151">current commit</a>.</p> <p>I am making sure that my mentors will be able to replicate my setup and give feedback very soon. The last week of GSoC is quite frenzied, with everyone racing to produce a consistent wrap-up of their projects. The next post will officially be the last post of my GSoC 2015 experience. In reality, of course, I will keep working on the project and keep blogging :)</p> <p>Till then, ciao.</p>Fine-tuning Telerobotics2015-08-07T19:53:52+00:002015-08-07T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-08-07:/letters/fine-tuning-telerobotics/<p>Hi! As discussed in the previous week, I have been <strong>able</strong> to get the <strong>integration of Telerobotics and Bodytracking</strong> up and running. Huge Victory :) Let me say the same thing in a much bolder typeface -</p> <h2>Integration Successful!</h2> <p>The following screenshot demonstrates what I&rsquo;m talking about -</p> <p><img alt="Telerobotics Integration" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/telerobotics-0.png"></p> <h2>Screen recording - YouTube Video …</h2><p>Hi! As discussed in the previous week, I have been <strong>able</strong> to get the <strong>integration of Telerobotics and Bodytracking</strong> up and running.
Huge Victory :) Let me say the same thing in a much bolder typeface -</p> <h2>Integration Successful!</h2> <p>The following screenshot demonstrates what I&rsquo;m talking about -</p> <p><img alt="Telerobotics Integration" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/telerobotics-0.png"></p> <h2>Screen recording - YouTube Video</h2> <p>I used the same tool for screen-capturing this integration that I used for real-time streaming from a 3-D camera. The output is as follows -</p> <p><span class="videobox"> <iframe width="640" height="390" src='https://www.youtube.com/embed/T3qbZaGvYao' frameborder='0' webkitAllowFullScreen mozallowfullscreen allowFullScreen> </iframe> </span></p> <p>If my blogging platform is unable to embed the video on the page, you can <a href="proxy.php?url=https://youtu.be/T3qbZaGvYao">use this link</a> to watch the first version of the Telerobotics and Bodytracking integration. The Visual Tracker designed by Vito looks like this -</p> <p><img alt="Visual Tracker" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/visual-2.png"></p> <h2>Current Status</h2> <p>It is evident from the video that the setup is functional but not efficient. Moreover, it is buggy. The velocity values are way beyond what ROS can accept, which results in jerks in Husky&rsquo;s motion. There is also a disparity between the refresh rates of ROS and Tango-Controls, which shows up as the Device intermittently becoming unavailable.</p> <p>I strongly hope I&rsquo;ll have solved these issues by the next post. Of all the <strong>aha</strong> moments that I have been privy to, watching the Integration work was probably the biggest one of them all. It looks futuristic to me. With the Internet of Everything, a lot of things are going to use Teleoperation. I am so glad that we at the Italian Mars Society are gauging future trends and experimenting with them in the present.
I am honored to be facilitating that experiment.</p> <p>My next post is surely going to be a much more exciting run-down of how Telerobotics progresses :)</p> <p>Stay Tuned. Ciao!</p>Telerobotics and Bodytracking - The Rendezvous2015-07-31T19:53:52+00:002015-07-31T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-31:/letters/telerobotics-and-bodytracking-the-rendezvous/<p>Hi! The past week was a refreshingly positive one. I was able to solve some of the insidious issues that were plaguing the efforts that I was putting in last week.</p> <h2>Virtual Machine Networking issues Solved!</h2> <p>I was able to use the Tango server across the Windows 7 Virtual Machine …</p><p>Hi! The past week was a refreshingly positive one. I was able to solve some of the insidious issues that had been plaguing my efforts last week.</p> <h2>Virtual Machine Networking issues Solved!</h2> <p>I was able to use the Tango server across the Windows 7 Virtual Machine and the Tango Host on my Ubuntu 14.04 Host Machine. The proper Networking mode for this turns out to be <strong>Bridged Networking mode</strong>, which basically tunnels a connection between the Virtual Machine and the host.</p> <p>In bridged mode, the Virtual Machine exposes a Virtual Network interface with its own IP Address and Networking stack. In my case it was <code>vmnet8</code>, with an IP Address different from the IP Address patterns used by the <em>real</em> Ethernet and WiFi Network Interface Cards. Using bridged mode, I was able to maintain the Tango Device Database server on Ubuntu and use Vito&rsquo;s Bodytracking device on Windows.
The Virtual Machine didn&rsquo;t noticeably slow things down while communicating across the Tango devices.</p> <p>This image explains what I&rsquo;m talking about -</p> <p><img alt="Jive on Windows and Ubuntu machines" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/jive_windows_ubuntu.png"></p> <p>In bridged mode, I chose the IP Address on the host which corresponds to the Virtual Machine interface - <code>vmnet8</code> in my case. I used the <code>vmnet8</code> interface on Ubuntu and a similar interface on the Windows Virtual Machine. I read quite a bit about how Networking works in Virtual Machines and was fascinated by the Virtualization in place.</p> <h2>Bodytracking meets Telerobotics</h2> <p>With Tango up and running, I had to ensure that <a href="proxy.php?url=https://vigentile.wordpress.com/2015/07/31/enhancement-of-kinect-integration-in-v-eras-fifth-report/">Vito&rsquo;s Bodytracking application</a> works on the Virtual Machine. To that end, I installed the <em>Kinect for Windows SDK</em>, <em>Kinect Developer Tools</em>, <em>Visual Python</em>, <em>Tango-Controls</em>, and <em>PyTango</em>. Setting up a new <em>virtual</em> machine mildly slowed me down but was a necessary step in the development.</p> <p>Once I had that bit running, I was able to visualize the <strong>simulated Martian Motivity walk done in Innsbruck</strong> in a training station. The Bodytracking server created by Vito <em>published</em> events corresponding to the <code>moves</code> attribute, which is a list of the following two metrics -</p> <ul> <li>Position</li> <li>Orientation</li> </ul> <p>I was able to read the attributes that the Bodytracking device was publishing by <strong>subscribing</strong> to Event Changes on that attribute.
This is done in the following way -</p> <div class="highlight"><pre><span></span><code>while TRIGGER:
    # Subscribe to the &#39;moves&#39; event from the Bodytracking interface
    moves_event = device_proxy.subscribe_event(
        &#39;moves&#39;,
        PyTango.EventType.CHANGE_EVENT,
        cb, [])
    # Wait for at least REFRESH_RATE seconds for the next callback.
    time.sleep(REFRESH_RATE)
</code></pre></div> <p>This ensures that the Subscriber doesn&rsquo;t exhaust the polled attributes at a rate faster than they are published. In that unfortunate case, an <code>EventManagerException</code> occurs which must be handled properly.</p> <p>Note the <code>cb</code> argument - it refers to the Callback function that is triggered when an Event change occurs. The callback function is responsible for reading and processing the attributes.</p> <p>The processing part in our case is the core of the <strong>Telerobotics-Bodytracking interface</strong>. It acts as the intermediary between Telerobotics and Bodytracking - converting the <em>position</em> and <em>orientation</em> values to the <strong>linear and angular velocity</strong> that Husky can understand. I use a high-performance container from the <code>collections</code> module known as <code>deque</code>. It can act both as a stack and a queue via <code>deque.append</code>, <code>deque.appendleft</code>, <code>deque.pop</code>, and <code>deque.popleft</code>.</p> <blockquote> <p>To calculate velocity, I compute the differences between consecutive events and their corresponding timestamps. The events are stored in a <code>deque</code>, popped when necessary, and subtracted from the current event values.</p> </blockquote> <p>For instance, this is how <strong>linear velocity</strong> processing takes place -</p> <div class="highlight"><pre><span></span><code># Position and Linear Velocity Processing
position_previous = position_events.pop()
position_current = position
linear_displacement = position_current - position_previous
linear_speed = linear_displacement / time_delta
</code></pre></div> <h2>ROS-Telerobotics Interface</h2> <p>We are halfway through the Telerobotics-Bodytracking architecture.
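Putting the pieces together, the event-to-velocity computation can be written as a small, self-contained function. This is a simplified sketch: the 1-D position model, the names, and the 1.0 m/s speed cap are illustrative assumptions; the real interface also handles orientation and angular velocity:

```python
from collections import deque

# Assumed Husky-friendly speed limit (illustrative, not the real spec)
MAX_LINEAR_SPEED = 1.0  # m/s

position_events = deque(maxlen=10)   # (timestamp, position) pairs

def on_move_event(timestamp, position):
    """Turn consecutive position events into a clamped linear speed."""
    if not position_events:
        # First event: nothing to difference against yet
        position_events.append((timestamp, position))
        return 0.0
    # Pop the previous event and difference it against the current one
    t_previous, position_previous = position_events.pop()
    time_delta = timestamp - t_previous
    linear_speed = (position - position_previous) / time_delta
    # Limit the command to what the rover can sustain
    linear_speed = max(-MAX_LINEAR_SPEED, min(MAX_LINEAR_SPEED, linear_speed))
    position_events.append((timestamp, position))
    return linear_speed
```

The final clamping step is a crude stand-in for the velocity filtering the interface needs before a speed is safe to forward to ROS.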
Once the velocities are obtained, we have everything we need to send to ROS. The challenge here is to use velocities which ROS and the Husky UGV can understand. The messages are published to ROS <em>only</em> when there is some change in the velocity. This has the added advantage of minimizing communication between ROS and Tango. When working with multiple distributed systems, it is always wise to keep the communication between them minimal. That&rsquo;s what I&rsquo;ve aimed to do. I&rsquo;ll be enhancing the interface even further by adding Trigger Overrides in case of an emergency situation. The speeds currently are not ROS-friendly. I am writing a high-pass and low-pass filter to limit the velocities to what Husky can sustain. Vito and I will be refining the User Step estimation and the corresponding Robot movements respectively.</p> <p>GSoC is only becoming more exciting. I&rsquo;m certain that I will be contributing to this project after GSoC as well. The Telerobotics scenario is full of possibilities, most of which I&rsquo;ve tried to cover in my GSoC proposal.</p> <p>I&rsquo;m back at my university now and it has become hectic but enjoyably challenging to complete this project. My next post will hopefully be a culmination of the Telerobotics/Bodytracking interface and the integration of 3D streaming with Oculus Rift Virtual Reality.</p> <p>Ciao!</p>Virtual Machines + Virtual Reality = Real Challenges!2015-07-24T19:53:52+00:002015-07-24T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-24:/letters/virtual-machines-virtual-reality-real-challenges/<p>Hi! For the past couple of weeks, I&rsquo;ve been trying to get a lot of things to work. Linux and Computer Networks seem to like me so much that they ensure my attention throughout the course of this program. This time it was dynamic libraries, Virtual Machine Networking, Docker …</p><p>Hi! For the past couple of weeks, I&rsquo;ve been trying to get a lot of things to work.
Linux and Computer Networks seem to like me so much that they keep claiming my attention throughout the course of this program. This time it was dynamic libraries, Virtual Machine Networking, Docker Containers, Head-mounted display errors and so on.</p> <p>A brief discussion about these:</p> <h2>Dynamic Libraries, Oculus Rift, and Python Bindings</h2> <p>Using the open-source Python bindings for the <strong>Oculus SDK</strong> available <a href="proxy.php?url=https://github.com/jherico/python-ovrsdk">here</a>, Franco and I ran into a problem -</p> <div class="highlight"><pre><span></span><code>ImportError: &lt;root&gt;/oculusvr/linux-x86-64/libOculusVR.so: undefined symbol: glXMakeCurrent
</code></pre></div> <p>To get to the root of the problem, I tried to list all dependencies of the <strong>shared object file</strong> -</p> <div class="highlight"><pre><span></span><code>linux-vdso.so.1 =&gt; (0x00007ffddb388000)
librt.so.1 =&gt; /lib/x86_64-linux-gnu/librt.so.1 (0x00007f6205e1d000)
libpthread.so.0 =&gt; /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f6205bff000)
libX11.so.6 =&gt; /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007f62058ca000)
libXrandr.so.2 =&gt; /usr/lib/x86_64-linux-gnu/libXrandr.so.2 (0x00007f62056c0000)
libstdc++.so.6 =&gt; /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f62053bc000)
libm.so.6 =&gt; /lib/x86_64-linux-gnu/libm.so.6 (0x00007f62050b6000)
libgcc_s.so.1 =&gt; /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f6204ea0000)
libc.so.6 =&gt; /lib/x86_64-linux-gnu/libc.so.6 (0x00007f6204adb000)
/lib64/ld-linux-x86-64.so.2 (0x00007f6206337000)
libxcb.so.1 =&gt; /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007f62048bc000)
libdl.so.2 =&gt; /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f62046b8000)
libXext.so.6 =&gt; /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007f62044a6000)
libXrender.so.1 =&gt; /usr/lib/x86_64-linux-gnu/libXrender.so.1 (0x00007f620429c000)
libXau.so.6 =&gt; /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007f6204098000)
libXdmcp.so.6 =&gt; /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007f6203e92000)
undefined symbol: glXMakeCurrent (./libOculusVR.so)
undefined symbol: glEnable (./libOculusVR.so)
undefined symbol: glFrontFace (./libOculusVR.so)
undefined symbol: glDisable (./libOculusVR.so)
undefined symbol: glClear (./libOculusVR.so)
undefined symbol: glGetError (./libOculusVR.so)
undefined symbol: glXDestroyContext (./libOculusVR.so)
undefined symbol: glXCreateContext (./libOculusVR.so)
undefined symbol: glClearColor (./libOculusVR.so)
undefined symbol: glXGetCurrentContext (./libOculusVR.so)
undefined symbol: glXSwapBuffers (./libOculusVR.so)
undefined symbol: glColorMask (./libOculusVR.so)
undefined symbol: glBlendFunc (./libOculusVR.so)
undefined symbol: glBindTexture (./libOculusVR.so)
undefined symbol: glDepthMask (./libOculusVR.so)
undefined symbol: glDeleteTextures (./libOculusVR.so)
undefined symbol: glGetIntegerv (./libOculusVR.so)
undefined symbol: glXGetCurrentDrawable (./libOculusVR.so)
undefined symbol: glDrawElements (./libOculusVR.so)
undefined symbol: glTexImage2D (./libOculusVR.so)
undefined symbol: glXGetClientString (./libOculusVR.so)
undefined symbol: glDrawArrays (./libOculusVR.so)
undefined symbol: glGetString (./libOculusVR.so)
undefined symbol: glXGetProcAddress (./libOculusVR.so)
undefined symbol: glViewport (./libOculusVR.so)
undefined symbol: glTexParameteri (./libOculusVR.so)
undefined symbol: glGenTextures (./libOculusVR.so)
undefined symbol: glFinish (./libOculusVR.so)
</code></pre></div> <p>This clearly implied one thing - <strong>libGL</strong> was not being linked. My task then was to <em>somehow</em> link libGL to the SO file that came with the Python Bindings. I tried out the following two options -</p> <ul> <li><strong>Creating my own bindings</strong>: I tried to regenerate the SO file from the Oculus C SDK using the amazing <a href="proxy.php?url=https://github.com/davidjamesca/ctypesgen">Python Ctypesgen</a>.
This method didn&rsquo;t work out, as I couldn&rsquo;t resolve the <em>header</em> files that are required by <em>Ctypesgen</em>. Nevertheless, I learned how to create Python Bindings and that is a huge take-away from the exercise. I had always wondered how Python interfaces are created out of programs written in other languages.</li> <li><strong>Making the existing shared object file believe that it is linked to libGL</strong>: So here&rsquo;s what I did - after a lot of searching, I found the nifty little environment variable that worked wonders for our Oculus development - <strong>LD_PRELOAD</strong></li> </ul> <p>As <a href="proxy.php?url=https://rafalcieslak.wordpress.com/2013/04/02/dynamic-linker-tricks-using-ld_preload-to-cheat-inject-features-and-investigate-programs/">this article</a> and <a href="proxy.php?url=https://blog.chaselambda.com/2014/11/28/how-tmux-starts-up-an-adventure-with-linux-tools-updated.html">this one</a> delineate, LD_PRELOAD makes it possible to force-load a dynamically linked shared object into memory. If you set LD_PRELOAD to the path of a shared object, that file will be loaded before any other library (including the C runtime, libc.so). For example, to run <code>ls</code> with your special malloc() implementation, do this:</p> <p><code>$ LD_PRELOAD=/path/to/my/malloc.so /bin/ls</code></p> <p>Thus, the solution to my problem was to place this in the <code>.bashrc</code> file -</p> <p><code>LD_PRELOAD="/usr/lib/x86_64-linux-gnu/libGL.so"</code></p> <p>This allowed Franco to create the Oculus Test Tango server and ensured that our Oculus Rift development efforts continue with gusto.</p> <h2>ROS and Autonomous Navigation</h2> <p>On the programming side, I&rsquo;ve been playing around with <code>actionlib</code> to interface Bodytracking with Telerobotics. I have created a simple walker script which provides a certain degree of autonomy to the robot and avoids collisions with objects by overriding human teleoperation commands.
An obstacle could be a Martian rock in a simulated environment or an uneven terrain with a possible ditch ahead. To achieve this, I use the <code>LaserScan</code> message and check the range readings at frequent intervals. The <em>LIDAR</em> readings determine that the robot is in one of the following states -</p> <ul> <li>Approaching an obstacle</li> <li>Going away from an obstacle</li> <li>Hitting an obstacle</li> </ul> <p>The state can be inferred from the LaserScan Messages. A ROS Action Server then waits for one of these events to happen and triggers the callback which tells the robot to stop, turn, and continue.</p> <h2>Windows and PyKinect</h2> <p>In order to run Vito&rsquo;s bodytracking code, I needed a Windows installation. Running into problems with a 32-bit Windows 7 Virtual Machine image I had, I needed to reinstall and use a 64-bit Virtual Machine image. I installed all the dependencies to run the bodytracking code. I am still stuck with Networking modes between the Virtual Machine and the Host machine. The TANGO host needs to be configured correctly to allow the TANGO_MASTER to point to the host and the TANGO_HOST to the virtual machine.</p> <h2>Docker and Qt Apps</h2> <p>Qt applications don&rsquo;t seem to work with sharing the display in a Docker container. The way out is to create users in the Docker container, which I&rsquo;m currently doing. I&rsquo;ll enable VNC and X-forwarding to allow the ROS Qt applications to work so that the other members of the Italian Mars Society can use the Docker container directly.</p> <h2>Gazebo Mars model</h2> <p>I took a brief look at the 3D models of Martian terrain available for free use on the Internet.
I&rsquo;ll be trying to obtain the Gale Crater region and represent it in Gazebo to drive the Husky in a Martian Terrain.</p> <h2>Documentation week!</h2> <p>In addition to strong-arming my CS concepts against the Networking and Linux issues that loom over the project currently, I updated and added documentation for the modules developed so far.</p> <p>Hope the next post explains how I solved the problems described in this post. Ciao!</p>Streamed away (in Real-Time)!2015-07-16T19:53:52+00:002015-07-16T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-16:/letters/streamed-away-in-real-time/<p>Hi! This post is <em>all</em> about <strong>Video Streaming and Cameras</strong> :-) If you&rsquo;ve wondered how services like YouTube Live or twitch.tv work, then this post is for you. After the <em>Innsbruck experiments</em> and <a href="proxy.php?url=https://sidcode.github.io/blog/2015/07/08/remote-tests-in-telerobotics/">Remote tests in Telerobotics</a>, it was time for me to create a full-fledged Real Time Video …</p><p>Hi! This post is <em>all</em> about <strong>Video Streaming and Cameras</strong> :-) If you&rsquo;ve wondered how services like YouTube Live or twitch.tv work, then this post is for you. After the <em>Innsbruck experiments</em> and <a href="proxy.php?url=https://sidcode.github.io/blog/2015/07/08/remote-tests-in-telerobotics/">Remote tests in Telerobotics</a>, it was time for me to create a full-fledged Real Time Video Streaming solution for the ERAS project. After a lot of frustration and learning, I&rsquo;ve been able to achieve the following milestones - </p> <ol> <li>Stream losslessly from a single camera in real-time to a Blender Game Engine instance.</li> <li>Create example Blender projects to test <em>multiple video sources</em> streaming over a network.</li> <li>Record a <strong>live stream</strong> from a <strong>stereoscopic camera</strong> into a side-by-side video encoded on the fly. 
</li> </ol> <p>It&rsquo;s going to be a very long post, as I&rsquo;ve been playing around with lots of video streaming stuff. All this experience has turned me into a confident Multimedia streamer.</p> <h2>Why am I doing this?</h2> <p>Integrating <em>Augmented</em> and <em>Virtual Reality</em> requires one to know the nitty-gritty of <strong>Multimedia Streaming</strong>. This week was spent learning and tinkering with the various options provided by <a href="proxy.php?url=https://ffmpeg.org/">FFmpeg</a> and <a href="proxy.php?url=linuxtv.org/downloads/v4l-dvb-apis/">Video4Linux2</a>. One of the aims of the Telerobotics project is to allow streaming of Rover Camera input to the Astronaut&rsquo;s Head-Mounted Device (<strong>Minoru 3D</strong> camera and <strong>Oculus Rift</strong> in my case). The streamed video has multiple uses -</p> <ol> <li>It is used by the various Tango servers (Planning, Vision, Telerobotics, etc.) and processed to obtain Semantic relationships between objects in the Martian environment.</li> <li>The video, in addition to the LIDAR and other sensing devices, is the interface to the Human world in the ERAS habitat on Mars. The video stream provides a window to Mars.</li> <li>The real-time stream helps the astronaut and the simulated astronaut guide the rover and the simulated rover around on Mars.</li> <li>Streaming is an integral component of both ERAS and V-ERAS, which we at the Italian Mars Society are currently working on.</li> </ol> <h2>Initial Impressions</h2> <p>When I started with 3D streaming, it <em>appeared</em> easy. &ldquo;I did it with a single camera; two cameras can&rsquo;t be a huge deal, right!&rdquo; <em>I had never been so wrong</em>. I found myself stuck in the usual embedded-device-versus-Linux-kernel tussle -</p> <ul> <li>The hardware of desktop machines is unsuitable for Streaming applications.
</li> <li>The Kernel is not configured to use multiple webcams</li> <li>This results in lots of <strong>memory-related</strong> errors - <code>insufficient memory</code>, <code>rt_underflow</code></li> </ul> <p>To tweak the Minoru and strike an optimal settings agreement with this cute little stereo camera, I began to dig into the core software components involved -</p> <h2>Video4Linux2 saves the day!</h2> <p>Video4Linux is an important driver framework which makes it possible for Linux users to use Video Capture devices (webcams and streaming equipment). It supports multiple features. The ones this project is concerned with are -</p> <ul> <li>Video Capture/Output and Tuning (<code>/dev/videoX</code>, streaming and control)</li> <li>Video Capture and Output overlay (<code>/dev/videoX</code>, control)</li> <li>Memory-to-Memory (Codec) devices (<code>/dev/videoX</code>)</li> </ul> <p><a href="proxy.php?url=https://archive.fosdem.org/2014/schedule/event/v4l_intro/">These slides</a> by Hans Verkuil (Cisco Systems) are an informative entry point for understanding how Video4Linux works.</p> <p>The different Streaming Modes supported by Video4Linux are -</p> <ul> <li>Read/Write (<strong>Supported by Minoru</strong>)</li> <li>Memory Mapped Streaming I/O (<strong>Supported by Minoru</strong>)</li> <li>User Pointer Streaming I/O</li> <li>DMA (Direct Memory Access) Buffer Streaming I/O</li> </ul> <p>The take-away from Video4Linux is understanding how streaming works.
So a Stream requires the following - queue setup, buffer preparation, starting and stopping the stream, waiting to prepare and waiting to finish, compression and encoding of the input stream, transmission over a channel, decompression and decoding of the received stream, and facilities for playback and time-seek.</p> <p>The Qt frontend to <code>v4l2</code> made me realize where the problem with the camera lay -</p> <p><img alt="Qv4l2 Minoru" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/minoru-qv4l2.jpg"></p> <p>The <code>video4linux2</code> specification allows for querying and configuring <strong>everything</strong> about Video Capture Cards. The nifty command-line utility <code>v4l2-ctl</code> is a lifesaver while debugging cameras.</p> <p>For instance, with the Stereo Camera connected, <code>v4l2-ctl --list-devices</code> gives -</p> <div class="highlight"><pre><span></span><code>Vimicro USB2.0 PC Camera (usb-0000:00:14.0-1.1):
        /dev/video1
Vimicro USB2.0 PC Camera (usb-0000:00:14.0-1.4):
        /dev/video2
WebCam SC-13HDL11939N (usb-0000:00:1a.0-1.4):
        /dev/video0
</code></pre></div> <div class="highlight"><pre><span></span><code>v4l2-ctl --list-frameintervals=width=640,height=480,pixelformat=&#39;YUYV&#39;
</code></pre></div> <p>gives</p> <div class="highlight"><pre><span></span><code>ioctl: VIDIOC_ENUM_FRAMEINTERVALS
        Interval: Discrete 0.033s (30.000 fps)
        Interval: Discrete 0.067s (15.000 fps)
</code></pre></div> <p>This means that I have to use one of these settings for getting input from the camera, and then transcode the capture into the desired stream characteristics.</p> <h2>Knowing your stereoscopic Camera</h2> <p><img alt="Stereo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/stereo-1.png"></p> <p>VLC carefully configured to stream the Left and Right Minoru Cameras.</p> <p>The <a href="proxy.php?url=https://www.minoru3d.com/">Minoru 3D</a> webcam uses the following <em>Color Spaces</em> -</p> <ol> <li>RGB3</li> <li>YU12</li> <li>YV12</li> <li>YUYV</li> <li>BGR3</li> </ol> <p>Explanations ahead&hellip;</p> <blockquote> <p>When colors meet computers and humans</p> </blockquote> <p>Color Spaces are models of &lsquo;Color Organization&rsquo; that enable reproducible representations of color in different media (analog, digital). Color is a subjective human visual-perceptual property; recursing through these definitions on Wikipedia took me back to Middle School. Color is also a physical (observable and measurable) property. The way we humans see it is not the same as the way color-sensing photodiodes capture it or computer monitors reproduce it. Translating color from one basis to another requires a data structure known as the <strong>color space</strong>. The signals from the webcam are encoded into one of the color spaces. Just in case you&rsquo;re wondering - the YUV model describes colors in terms of a <strong>Luma (luminance)</strong> component and two chrominance components (U and V). The 2-D UV plane can describe all colors. YUV can be converted into RGB and vice-versa. The YUV422 data format shares U and V values between two pixels.
As a result, these values are transmitted to the PC image buffer only once for every two pixels, resulting in an average transmission rate of 16 bits per pixel. Capturing in the YUV 4:2:2 format is more bandwidth-efficient than RGB formats, whereas color reproduction on a pixel array is more convenient via RGB. For streaming video from a stereo camera system like the Minoru, an RGB color space is the best option because it performs faster with a codec like MJPEG (Motion JPEG), which is what the Blender Game Engine stream ultimately requires. I hope this theoretical explanation adequately describes the challenge I&rsquo;ve been trying to crack.</p> <p>FFmpeg built with <code>v4l2-utils</code> support is used for the Stereo Streaming.</p> <h2>Experiments with Blender</h2> <p>I tried capturing the two video devices directly from the Blender Game Engine application. It was a good experience learning about creating basic Blender games.</p> <p><img alt="Blender Game" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/blender-try-two-sources.jpg"></p> <p>The workflow to this end was -</p> <ul> <li>Create two Cube Meshes</li> <li>Enable GLSL shading mode</li> <li>Set Object Shading to <code>Shadeless</code> to enhance brightness</li> <li>Add Image Textures to both cubes</li> <li>Add a <code>sensor</code> that is triggered to <code>True</code> <strong>always</strong>.</li> <li>Add a Python script controller corresponding to each sensor.</li> <li>The script to control the right camera of the stereo system is -</li> </ul> <div class="highlight"><pre><span></span><code>import VideoTexture
import bge

contr = bge.logic.getCurrentController()
obj = contr.owner

if not hasattr(bge.logic, 'video'):
    matID = VideoTexture.materialID(obj, 'IMimage.png')
    bge.logic.video = VideoTexture.Texture(obj, matID)
    bge.logic.video.source = VideoTexture.VideoFFmpeg("/dev/video2", 0)
    bge.logic.video.source.scale = True
    bge.logic.video.source.flip = True
    bge.logic.video.source.framerate = 0.2
    bge.logic.video.source.repeat = -1
    bge.logic.video.source.play()

print("In Video 2 fps: ", bge.logic.video.source.framerate)
bge.logic.video.refresh(True)
</code></pre></div> <p>But it turns out the Blender Game Engine does not provide extensive video-device control. It relies on the default settings provided by Video4Linux. Since the Minoru camera is unable to stream both camera outputs at 30 frames per second, Blender simply gives in and compromises by playing the first camera output that it receives. 
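</p> <p>The failure makes sense in light of the bandwidth arithmetic from the colour-space discussion above. A quick back-of-the-envelope check in Python (640x480 at 30 fps is an assumed capture setting for illustration):</p> <div class="highlight"><pre><span></span><code># Bandwidth sketch: YUV 4:2:2 averages 16 bits/pixel because each pair
# of pixels shares one U and one V sample (2*8 Y + 8 U + 8 V = 32 bits
# for two pixels), while packed RGB needs 24 bits per pixel.
WIDTH, HEIGHT, FPS = 640, 480, 30  # assumed capture settings

def raw_bitrate_mbps(bits_per_pixel):
    """Uncompressed bit rate in megabits per second."""
    return WIDTH * HEIGHT * FPS * bits_per_pixel / 1e6

print("One YUV 4:2:2 stream: %.1f Mbit/s" % raw_bitrate_mbps(16))
print("Two streams:          %.1f Mbit/s" % (2 * raw_bitrate_mbps(16)))
</code></pre></div> <p>Roughly 295 Mbit/s of raw 4:2:2 video for both cameras is a large share of USB 2.0&rsquo;s practical throughput, which is consistent with the kernel refusing to allocate buffers for the second stream.</p> <p>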
Video4Linux simply reports <code>Insufficient Memory</code> for the other stream.</p> <p>The output could only support one camera at a time - <img alt="Blender cameras" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/blender-try-two-cameras.jpg"></p> <p>The BGE documentation is ambiguous about how the VideoTexture module should be used to control webcam devices.</p> <p>It was nevertheless an exciting introduction to contemporary game design. The take-away was that the Blender Game Engine cannot handle cameras at the hardware level, so network streaming with FFmpeg was the only option. </p> <h2>FFmpeg - the one-stop-shop for Multimedia</h2> <p>My search for the perfect streaming tool ended with FFmpeg. It amazes me how versatile this software is. Some people even call it the <a href="proxy.php?url=https://sonnati.wordpress.com/2011/08/08/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-ii/">Swiss-army knife of Internet streaming</a>. So I basically had to work with streams. Streams are multimedia resources identified with the help of a <em>Media Resource Locator</em> (<strong>MRL</strong>). A combination of <code>ffmpeg</code> and <code>ffserver</code> is what I used to achieve the desired results. 
The stereoscopic stream produced will be used by multiple applications-</p> <ol> <li>Streaming to the Head-Mounted Device (currently Oculus Rift)</li> <li>Processing video of the Martian environment.</li> <li>Viewing it in the ERAS application from ground control.</li> </ol> <blockquote> <p>Why FFmpeg?</p> </blockquote> <ul> <li>It is fast, reliable, and free.</li> <li>It provides a complete solution from streaming and transcoding to media playback, conversion, and probe analysis.</li> </ul> <p>Quoting from its <a href="proxy.php?url=https://ffmpeg.org/ffmpeg.html">documentation</a> -</p> <blockquote> <p>ffmpeg reads from an arbitrary number of input &ldquo;files&rdquo; (which can be regular files, pipes, network streams, grabbing devices, etc.), specified by the -i option, and writes to an arbitrary number of output &ldquo;files&rdquo;, which are specified by a plain output filename. Anything found on the command line which cannot be interpreted as an option is considered to be an output filename. </p> </blockquote> <p>I tinkered with loads of <code>ffmpeg</code> options and created a lot of useful junkcode. The good thing about GSoC is that it makes you aware of the open-source influences out there. 
Throughout this work on streaming, I was motivated by the philosophy of <strong>Andrew Tridgell</strong> who says that <a href="proxy.php?url=https://samba.org/ftp/tridge/talks/junkcode.pdf">&ldquo;junkcode can be an important learning tool&rdquo;</a>.</p> <div class="highlight"><pre><span></span><code>ffmpeg -f v4l2 -framerate 15 -video_size 640x480 -i /dev/video1 outp1.mp4 -framerate 15 -i /dev/video2 outp2.mp4
</code></pre></div> <p>This resulted in a steady video stream.</p> <p>A sample of the encoder log at three different frames -</p> <div class="highlight"><pre><span></span><code>frame= 1064 fps= 16 q=27.0 q=27.0 size=631kB time=00:01:07.06
frame= 1072 fps= 16 q=27.0 q=27.0 size=723kB time=00:01:07.60
frame= 1079 fps= 16 q=27.0 q=27.0 size=750kB time=00:01:08.06
</code></pre></div> <p>Learning about the <code>ffmpeg-filters</code> made this experience worthwhile. I was now able to overlay the two videos side-by-side and combine them in real-time. This is the script that I used -</p> <div class="highlight"><pre><span></span><code>ffmpeg -s 320x240 -r 24 -f video4linux2 -i /dev/video1 -s 320x240 -r 24 -f video4linux2 -i /dev/video2 -filter_complex "[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg];[1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w" -c:v libx264 -crf 23 -preset medium -movflags faststart nerf.mp4
</code></pre></div> <p>It basically tells ffmpeg to use a resolution of 320x240 and 24 fps for each of the camera devices and to apply an overlay filter for side-by-side video output. <code>PTS-STARTPTS</code> synchronizes the timestamps of the two streams, and the preset and CRF settings give efficient encoding.</p> <p>I shot a video using the Minoru video camera. After applying the Overlay filter, I got a nice video with the Left and Right video streams arranged side-by-side. 
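</p> <p>For convenience, the long command above can also be composed programmatically. Here is a hedged Python sketch (the device paths, sizes, and output name are illustrative defaults, not fixed project values):</p> <div class="highlight"><pre><span></span><code>import subprocess

# Build the side-by-side ffmpeg command from parts so the devices and
# geometry are easy to tweak. Mirrors the shell command shown above.
def side_by_side_cmd(left="/dev/video1", right="/dev/video2",
                     size="320x240", fps=24, out="stereo.mp4"):
    filt = ("[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg];"
            "[1:v]setpts=PTS-STARTPTS[fg];"
            "[bg][fg]overlay=w")
    return ["ffmpeg",
            "-s", size, "-r", str(fps), "-f", "video4linux2", "-i", left,
            "-s", size, "-r", str(fps), "-f", "video4linux2", "-i", right,
            "-filter_complex", filt,
            "-c:v", "libx264", "-crf", "23", "-preset", "medium",
            "-movflags", "faststart", out]

print(" ".join(side_by_side_cmd()))
# To actually run it: subprocess.run(side_by_side_cmd(), check=True)
</code></pre></div> <p>Passing the arguments as a list avoids shell-quoting headaches around the <code>-filter_complex</code> string.</p> <p>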
In this screenshot, I am pointing my little brother&rsquo;s Nerf guns towards each of the Minoru&rsquo;s two cameras -</p> <p><img alt="Minoru Nerf Gun" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/minoru-nerf.png"></p> <p>I can experiment with the <strong>Stereoscopic anaglyph filters</strong> to extend this to a single-screen 3D live stream. But the present task is streaming to the Oculus Rift, which is what I&rsquo;ll be working on next. In addition to <code>ffmpeg</code>, I also made use of <code>ffserver</code> and <code>ffplay</code> in my streaming workflow. These have been explained in a <a href="proxy.php?url=https://sidcode.github.io/blog/2015/07/01/mid-term-report-gsoc-15/">previous post</a>.</p> <h2>Experiments with <code>v4l2stereo</code></h2> <p>Working with stereoscopic cameras differs from a traditional Computer Vision workflow. Each camera requires calibration in order for range-imaging applications like depth maps and point clouds to work. I calibrated my camera using the excellent <a href="proxy.php?url=https://github.com/bashrc/v4l2stereo">v4l2stereo</a> tool.</p> <p>Here are some screenshots -</p> <p><img alt="Minoru Calibration" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/minoru-calibration.jpg"></p> <p>Basic Feature detection -</p> <p><img alt="Minoru Features" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/minoru-features.jpg"></p> <h2>Closing remarks</h2> <p>This was a very hectic couple of weeks. The output I produced pales in comparison to the tinkering behind it. I&rsquo;ll include in the documentation all the important scripts that did not make it into the final repository, so that future students won&rsquo;t have to wade through the steep learning curve of Multimedia Streaming. 
All the work regarding this can be found <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/a31a7a135eb0315c4d3aa4d968e0832666af14eb/servers/telerobotics/streams/?at=default">here</a>. I realized the overwhelming importance of IRC when I got help from the #ffmpeg and #v4l2 channels while I was stuck with no end in sight. I gathered a GREAT DEAL of experience in Video Streaming which I hope will go a long way.</p> <p>This has been one giant bi-weekly report. Thank you for reading. <em>Ciao!</em></p>RIP, Mr. Iwata2015-07-13T16:07:04+00:002015-07-13T16:07:04+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-13:/letters/rip-mr-iwata/<p>Satoru Iwata died on July 11, 2015. I (coincidentally) started replaying Pokemon Emerald and Crystal on July 11. Completed most of Crystal in the next two days to celebrate his wonderful creation.</p> <p>More about him- https://bulbapedia.bulbagarden.net/wiki/Satoru_Iwata</p> <p>He got the battle system up and running in …</p><p>Satoru Iwata died on July 11, 2015. I (coincidentally) started replaying Pokemon Emerald and Crystal on July 11. Completed most of Crystal in the next two days to celebrate his wonderful creation.</p> <p>More about him- https://bulbapedia.bulbagarden.net/wiki/Satoru_Iwata</p> <p>He got the battle system up and running in Pokemon games. He was the producer of Pokemon Crystal.</p> <p>Learning of his death on July 14, I became misty-eyed (no Pokemon pun intended). This was the last coincidence I&rsquo;d have expected. Crystal was his baby. It is because of him that the whole Kanto region could be added to Pokemon GSC. He will be missed. I never thought I would take up Pokemon again after 2005.</p> <p>Regarding video games - I also saw VGHS this summer, which has its own take on Pokemon - Pokermon.</p> <p>I&rsquo;m misty-eyed now. I also played Pokemon Stadium, which he developed. To honor his contribution, I played all the Pokemon console game soundtracks on a loop. 
</p>Remote tests in Telerobotics2015-07-08T19:53:52+00:002015-07-08T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-08:/letters/remote-tests-in-telerobotics/<p>Ciao :) The <em>sixth week</em> of GSoC 2015 got over. According to the <a href="proxy.php?url=https://erasproject.org/2015-gsoc/#2">Telerobotics project timeline</a>, this week was supposed to be the <strong>Buffer Week</strong> to account for any unforeseen work that may pop up. We at the <strong>Italian Mars Society</strong> were trying to get ROS communication possible over a <em>large …</em></p><p>Ciao :) The <em>sixth week</em> of GSoC 2015 got over. According to the <a href="proxy.php?url=https://erasproject.org/2015-gsoc/#2">Telerobotics project timeline</a>, this week was supposed to be the <strong>Buffer Week</strong> to account for any unforeseen work that may pop up. We at the <strong>Italian Mars Society</strong> were trying to get ROS communication working over a <em>large</em> network. After effective discussion via mail and prioritizing on Trello, the <strong>first Husky test</strong> was scheduled on July 1, the <strong>second test</strong> on July 7, and the <strong>third test</strong> on July 8. It was an international effort spanning the UTC-5:00, UTC+2:00, and UTC+5:30 timezones, so zeroing in on a common time was an interesting sub-challenge in itself.</p> <blockquote> <p>By a <strong>large</strong> network, I mean this -</p> </blockquote> <p><img alt="Remote Testing" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/remote-problem.png"></p> <p>At first glance, the problem statement looks quite tractable and practical. 
But like all problems in Computer Networks, this one looked easy in theory yet frustrated the budding Computer Scientist in me, as the proposed solutions didn&rsquo;t work out.</p> <blockquote> <p>Husky Test 1</p> </blockquote> <p>Matt (from the Space Research Systems Group, Carleton University), Franco, and I were trying to get the Husky UGV in Canada to respond to commands sent from the three parts of the world involved (Canada, India, Italy). A few problems we came across -</p> <ol> <li> <p>ROS version issues caused a minor problem. The Husky robot was running an older version of ROS (Hydro) while Franco and I were using the newer version (Indigo). This caused problems in reading certain Husky messages. Solution - upgrade the ROS version on the Husky robot OR downgrade ours to ROS Hydro and Ubuntu 12.04.</p> </li> <li> <p>Network issues - we were unable to communicate across all three computers in all cases. There was no bidirectional communication between the ROS computers, and ports were blocked.</p> </li> <li> <p><strong>Success</strong> - GPS messages and status messages were received from the Husky robot laptop set as the ROS Master. But the Husky laptop was unable to receive teleoperation messages from Franco&rsquo;s computer and mine (even though it detected that we were publishing messages). Again a network problem.</p> </li> </ol> <blockquote> <p>Solution - <strong>Virtual Private Networks</strong>, well almost&hellip;</p> </blockquote> <p>At first, I had to ensure that the TP-Link WiFi router at home was not creating problems. To rule this out, I placed my laptop&rsquo;s interface in the <strong>Demilitarized Zone (DMZ)</strong> and enabled <strong>Port Forwarding</strong> for all the ports of interest.</p> <blockquote> <p><em>Success</em> with Blender Game Engine Streaming</p> </blockquote> <p>Now, this solved quite a few problems - my public IP could now behave like one. 
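</p> <p>A tiny Python helper for sanity-checking the forwarded ports (11311 is ROS&rsquo;s default master port; the host is whichever peer you are probing - both are assumptions here, not values from the tests):</p> <div class="highlight"><pre><span></span><code>import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. probe the ROS master port on a forwarded public IP
print(port_open("127.0.0.1", 11311))
</code></pre></div> <p>Note that this only checks TCP reachability; UDP-based traffic needs a separate test.</p> <p>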
To prove this, Franco and I held a web-stream session in which his laptop in Italy acted as the Blender Game Engine client while I provided a live video feed from the Minoru camera through an <strong>FFmpeg server</strong>. His words - &ldquo;You are live. I can see the stream.&rdquo; - provided the much-needed boost for tackling the remaining Computer Networks problems I had to solve in the following couple of days.</p> <p>Coming to the VPN problem, I first read about the various VPN server solutions available, like -</p> <ul> <li>OpenVPN</li> <li>PPTP (Point-to-Point Tunneling Protocol)</li> <li>IPSec</li> <li>SSH Tunneling</li> </ul> <p>The second Husky test was done with a PPTP VPN setup, which wasn&rsquo;t quite successful. The reason: ROS requires bidirectional communication between the peers, and I couldn&rsquo;t become a peer while I was the VPN server. This caused a slew of other pesky problems like <code>REQ TIMEOUTS</code>, disconnected ROS nodes, disabling Internet on the VPN server, etc. But as a start, it was reassuring that the problem could be solved. I realized that working with computers at the scale of the Internet is no child&rsquo;s play. But there was another takeaway from the second Husky test. Andrea (from the Husky team) could work with my remote node as the ROS master and still get the Husky up and running. This means that all the Husky traffic and node maintenance could be routed through my PC and relayed to the Husky. <em>Very reassuring.</em></p> <p>Armed with the Computer Networks concepts I learnt at college, I set out to set up the slightly tougher OpenVPN server. This is a snapshot of the OpenVPN access server that I set up -</p> <p><img alt="OpenVPN users" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/openvpn-users.png"></p> <p>I was not only able to set up a world-wide VPN, but also to establish communication among the peers. 
But the firewalls on the Husky computer network were too strong for it and sent Andrea&rsquo;s laptop into a continuous <em>Trying to Reconnect</em> loop. There went our hopes with OpenVPN. I am still looking into this issue. The main problem was that OpenVPN&rsquo;s UDP channel was accessible from the Husky network but the TCP channels were not. This caused intermittent connection losses, and the OpenVPN client couldn&rsquo;t figure out what to do. There must be a solution to this and I&rsquo;ll find it.</p> <p>Throughout this experience, I learnt a lot of new things about practical Computer Networks. Once I&rsquo;m able to crack the VPN problem, I could put it to use in diverse scenarios (remote robotics testing, as a road warrior, Internet of Things applications, creating a network of friends, etc.). A VPN brings everyone onto the same page (or logical subnet). I also did quite a bit of work on the Stereo Video Streaming, which will be the theme of my next post. Stay tuned.</p> <p><em>Ciao!</em></p>Mid-term Report - GSoC '152015-07-01T19:53:52+00:002015-07-01T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-07-01:/letters/mid-term-report-gsoc-15/<p>Hi all! I made it through the first half of the <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">GSoC 2015 program</a>. This is the <strong>evaluation week</strong> of the <a href="proxy.php?url=https://www.google-melange.com/gsoc/homepage/google/gsoc2015">Google Summer of Code 2015 program</a> with the <a href="proxy.php?url=https://www.python.org/psf/">Python Software Foundation</a> and the <a href="proxy.php?url=https://erasproject.org/">Italian Mars Society ERAS Project</a>. Mentors and students evaluate the journey so far in the program …</p><p>Hi all! I made it through the first half of the <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">GSoC 2015 program</a>. 
This is the <strong>evaluation week</strong> of the <a href="proxy.php?url=https://www.google-melange.com/gsoc/homepage/google/gsoc2015">Google Summer of Code 2015 program</a> with the <a href="proxy.php?url=https://www.python.org/psf/">Python Software Foundation</a> and the <a href="proxy.php?url=https://erasproject.org/">Italian Mars Society ERAS Project</a>. Mentors and students evaluate the journey so far in the program by answering some questions about their students and mentors respectively. On comparing with the timeline, I reckoned that I am on track with the project so far.</p> <blockquote> <p>The entire <strong>Telerobotics with Virtual Reality</strong> project can be visualized in the following diagram -</p> </blockquote> <p><img alt="Project Architecture" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/telerobotics-diagram.png"></p> <h2>Achievements-</h2> <h3>Husky-ROS-Tango Interface</h3> <ul> <li><strong>ROS-Tango interfaces</strong> to connect the <strong>Telerobotics</strong> module with the <strong>rest of ERAS</strong>.</li> <li> <p>ROS Interfaces for Navigation and Control of Husky <img alt="Husky Navigation" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/navigate-ros.png"></p> </li> <li> <p><a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/24/the-half-life-of-telerobotics/">Logging Diagnostics</a> of the robot to the Tango Bus</p> </li> <li>Driving the Husky around using human commands <img alt="Husky Commands" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/husky-command.png"></li> </ul> <h3>Video Streaming</h3> <ul> <li>Single Camera Video streaming to Blender Game Engine</li> </ul> <p>This is how it works. 
<strong>ffmpeg</strong> is used as the streaming server to which the Blender Game Engine subscribes.</p> <p>The <code>ffserver.conf</code> file, which describes the characteristics of the stream, is configured as follows:</p> <div class="highlight"><pre><span></span><code>Port 8190
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 50000
NoDaemon

&lt;Feed webcam.ffm&gt;
file /tmp/webcam.ffm
FileMaxSize 2000M
&lt;/Feed&gt;

&lt;Stream webcam.mjpeg&gt;
Feed webcam.ffm
Format mjpeg
VideoSize 640x480
VideoFrameRate 30
VideoBitRate 24300
VideoQMin 1
VideoQMax 5
&lt;/Stream&gt;
</code></pre></div> <p>Then the Blender Game Engine and its associated Python library <code>bge</code> kick in to display the stream on the <strong>Video Texture</strong>:</p> <div class="highlight"><pre><span></span><code># Get an instance of the video texture
bge.logic.video = bge.texture.Texture(obj, ID)

# An ffmpeg server is streaming the feed on the IP:PORT/FILE
# specified in FFMPEG_PARAM;
# BGE reads the stream from the mjpeg file.
bge.logic.video.source = bge.texture.VideoFFmpeg(FFMPEG_PARAM)
bge.logic.video.source.play()
bge.logic.video.refresh(True)
</code></pre></div> <blockquote> <p>The entire source code for single camera streaming can be found <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/v-eras-blender/src/42063c0b489152a9f124f80824ad095a752c29ff/scripts/webstream/single%20camera/?at=default">in this repository</a>.</p> </blockquote> <ul> <li>Setting up the <strong>Minoru Camera</strong> for stereo vision <img alt="Minoru Camera" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/minoru.jpg"></li> </ul> <p>It turns out this camera can stream at <strong>30 frames per second</strong> for both cameras. 
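</p> <p>As an aside, the mjpeg feed that <code>ffserver</code> publishes is essentially a sequence of JPEG images, so it can be sanity-checked without any video tooling by counting JPEG start/end markers. A small illustrative sketch (not from the ERAS codebase):</p> <div class="highlight"><pre><span></span><code># Count JPEG frames in a chunk of MJPEG data by scanning for the
# SOI (0xFFD8) and EOI (0xFFD9) markers that delimit each image.
def count_jpeg_frames(data):
    frames, i = 0, 0
    while True:
        start = data.find(b"\xff\xd8", i)
        if start == -1:
            return frames
        end = data.find(b"\xff\xd9", start + 2)
        if end == -1:
            return frames
        frames += 1
        i = end + 2

fake = (b"\xff\xd8" + b"payload" + b"\xff\xd9") * 3  # three fake frames
print(count_jpeg_frames(fake))
</code></pre></div> <p>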
The last week has been particularly challenging: figuring out the optimal settings for the Minoru webcam to work. These depend on the video buffer memory allocated by the <strong>Linux Kernel</strong> for <code>libuvc</code>- and <code>v4l2</code>-compatible webcams. Different kernel versions result in different performance. With the kernel version that I am using, it is inefficient to stream the left and right cameras at a frame rate greater than 15 fps.</p> <ul> <li>Setting up the Oculus Rift DK1 for the <strong>Virtual Reality</strong> work in the upcoming second term <img alt="Oculus Rift" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/oculus-rift.jpg"></li> </ul> <h2>Crash-testing and Roadblocks</h2> <p>This project was not without its share of obstacles. A few memorable roadblocks come to mind-</p> <ol> <li> <p><strong>Remote Husky testing</strong> - Matt (from <strong>Canada</strong>), Franco (from <strong>Italy</strong>), and I (from <strong>India</strong>) tested whether we could remotely control the Husky. The main issue we faced was <strong>Network Connectivity</strong>. We were all on geographically different networks, which ROS on our machines could not resolve. Thus some messages (like GPS) were accessible whereas others (like Husky status messages) were not. The solution we sought was to create a <strong>Virtual Private Network</strong> for our computers for future testing.</p> </li> <li> <p><strong>Minoru Camera performance differences</strong> - Since the Minoru&rsquo;s performance varies with the kernel version, I had to bump the frame rate down to <em>15 fps</em> for both cameras and stream them in the Blender Game Engine. This temporary hack should be resolved as ERAS moves to newer Linux versions.</p> </li> <li> <p><strong>Tango related</strong> - Tango-Controls is a sophisticated SCADA library with a server database for maintaining device server lists. 
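</p> <p>For context, an entry in that database can also be prepared and registered from Python. A hedged sketch of the idea (the server, class, and device names below are illustrative, not the ones in the ERAS repository):</p> <div class="highlight"><pre><span></span><code># Bundle the three fields a Tango database entry needs; validate the
# device name's domain/family/member shape before registering.
def device_entry(server, dev_class, name):
    if name.count("/") != 2:
        raise ValueError("Tango device names look like 'domain/family/member'")
    return {"server": server, "class": dev_class, "name": name}

entry = device_entry("Telerobotics/husky", "Diagnostics",
                     "eras/telerobotics/diag1")
print(entry["name"])

# With PyTango installed, registration would then go through the
# Database API, roughly: fill a PyTango.DbDevInfo from `entry` and
# pass it to PyTango.Database().add_device(...)
</code></pre></div> <p>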
It was painful to use the provided GUI, Jive, to configure the device servers. To bring the process in line with other development activities, I wrote a little CLI-based interactive script for device server registration and de-registration. A <a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/18/when-two-distributed-systems-meet/">blog post</a> explains this in detail.</p> </li> <li> <p><strong>Common testing platform</strong> - I needed to use ROS Indigo, which is supported only on Ubuntu 14.04. ERAS is currently using Ubuntu 14.10. To enable the Italian Mars Society members to execute my scripts, they needed my version of Ubuntu. <strong>Solution</strong> - virtual Linux containers. We are using a <strong>Docker Image</strong> which my mentors can run on their machines regardless of their native OS. <a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/12/all-for-docker-docker-for-all/">This post</a> explains this point.</p> </li> </ol> <h2>Expectations from the second term</h2> <p>This is a huge project in that I have to deal with <em>many different technologies</em> like -</p> <ol> <li>Robot Operating System</li> <li>FFmpeg</li> <li>Blender Game Engine</li> <li>Oculus VR SDK</li> <li>Tango-Controls</li> </ol> <p>So far, the journey has been exciting and there has been a lot of learning and development. 
The second term will be intense, challenging, and above all, fun.</p> <p>To-do list -</p> <ol> <li>Get the Minoru webcam to work with ffmpeg streaming</li> <li> <p>Use the Oculus for an Augmented Reality application <img alt="Oculus Rift" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/oculus-mars.jpg"> <a href="proxy.php?url=https://vimeo.com/111243246">Source</a></p> </li> <li> <p>Integrate Bodytracking with Telerobotics</p> </li> <li>Automate Husky movement and use a UR5 manipulator</li> <li>Set up a <a href="proxy.php?url=https://pptpclient.sourceforge.net/">PPTP</a> or <a href="proxy.php?url=https://openvpn.net/">OpenVPN</a> server for ERAS</li> </ol> <p>Time really flies when I am learning new things. GSoC so far has taught me not only how to avoid being a <a href="proxy.php?url=https://www.quora.com/What-are-the-characteristics-of-a-bad-software-engineer">bad software engineer</a>, but also how to be a good open-source community contributor. That is what the spirit of Google Summer of Code is about, and I have imbibed a lot. Besides, working with the Italian Mars Society has also motivated me to learn the Italian language. So Python is not the only language that I&rsquo;m practicing this summer ;)</p> <blockquote> <p>Here&rsquo;s to the second term of Google Summer of Code 2015! <img alt="GSoC Banner" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/gsoc-banner.png"></p> </blockquote> <p>Ciao :)</p>The Half-Life of Telerobotics2015-06-24T19:53:52+00:002015-06-24T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-06-24:/letters/the-half-life-of-telerobotics/<p>Hi all! If you&rsquo;ve been following my <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">previous posts</a>, you&rsquo;d have known that the Telerobotics module has been simmering for a couple of weeks. 
I&rsquo;m happy to announce that it is almost complete and would hopefully be integrated with Vito&rsquo;s Bodytracking module.</p> <p>The last week (week …</p><p>Hi all! If you&rsquo;ve been following my <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">previous posts</a>, you&rsquo;d have known that the Telerobotics module has been simmering for a couple of weeks. I&rsquo;m happy to announce that it is almost complete and will hopefully be integrated with Vito&rsquo;s Bodytracking module.</p> <p>The last two weeks (weeks four and five) were the busiest of GSoC for me.</p> <h2>Learning Experience</h2> <ul> <li>I learnt A LOT about Python software development</li> <li>Different types of <a href="proxy.php?url=https://www.oreilly.com/programming/free/software-architecture-patterns.csp">software architectures</a></li> <li><a href="proxy.php?url=https://pyvideo.org/video/1093/the-development-process-of-python">The development process of Python</a> by one of the members of the Italian Mars Society who has been the reason I&rsquo;m able to write more Pythonic code - <a href="proxy.php?url=https://wolfprojects.altervista.org/">Ezio Melotti</a></li> <li><a href="proxy.php?url=https://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/quicktour.html#pytango-quick-tour">PyTango</a> development</li> <li>ipython and how helpful it can be for Tango applications</li> <li>Message queues - both ROS and Tango utilize ZeroMQ, which makes integrating ROS and Tango much more scalable</li> <li><a href="proxy.php?url=https://www.vlfeat.org/overview/sift.html">SIFT</a> in Python - I will be working with my mentor Fabio Nigi on this very soon</li> <li>Making my own stereo camera</li> </ul> <h2>Deliverables</h2> <ul> <li>A <strong>ROS node</strong> which collects information from all the interesting topics on the Husky robot. 
This can be found <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/db8c7061f4768534ebb2621296a20a016bd240ad/servers/telerobotics/src/robot-info-collector.py?at=default">here</a></li> <li>A <strong>Tango Server</strong> which integrates with ROS to provide diagnostic information from the robot (<em>Battery Status, Temperature Levels, Current Draw, Voltage, Error Conditions</em>)</li> <li>A simulated version of the Tango server for the Planning and Scheduling application that Shridhar is working on. These can be accessed <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/db8c7061f4768534ebb2621296a20a016bd240ad/servers/telerobotics/src/robot-diagnostics-server.py?at=default">here</a></li> <li><strong>Soft real-time network streaming</strong> - an FFmpeg server and Blender client for a single-camera video stream. This can be found <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/v-eras-blender/src/42063c0b489152a9f124f80824ad095a752c29ff/scripts/webstream/single%20camera/?at=default">here</a></li> </ul> <h2>Under <strong>heavy</strong> Development</h2> <ul> <li>Integration of Bodytracking with Telerobotics.
The following message format has been decided upon by the mentors and students:</li> </ul> <div class="highlight"><pre><span></span><code><span class="c1"># Attribute definitions for various diagnostic messages</span><span class="w"></span> <span class="w"> </span><span class="n">moves</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">attribute</span><span class="p">(</span><span class="n">label</span><span class="o">=</span><span class="s2">&quot;Linear and angular displacement&quot;</span><span class="p">,</span><span class="w"> </span><span class="n">dtype</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">(</span><span class="nb nb-Type">float</span><span class="p">,),</span><span class="w"></span> <span class="w"> </span><span class="n">display_level</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">DispLevel</span><span class="o">.</span><span class="n">EXPERT</span><span class="p">,</span><span class="w"></span> <span class="w"> </span><span class="n">access</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">AttrWriteType</span><span class="o">.</span><span class="n">READ</span><span class="p">,</span><span class="w"></span> <span class="w"> </span><span class="n">unit</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">&quot;(meters, radians)&quot;</span><span class="p">,</span><span class="w"></span> <span class="w"> </span><span class="n">fget</span><span class="o">=</span><span class="s2">&quot;getMoves&quot;</span><span class="p">,</span><span class="w"> </span><span class="n">polling_period</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">POLLING</span><span class="p">,</span><span class="w"></span> <span class="w"> </span><span class="n">max_dim_x</span><span class="w"> </span><span 
class="o">=</span><span class="w"> </span><span class="mi">2</span><span class="p">,</span><span class="w"> </span><span class="n">max_dim_y</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="mi">1</span><span class="p">,</span><span class="w"></span> <span class="w"> </span><span class="n">doc</span><span class="o">=</span><span class="s2">&quot;An attribute for Linear and angular displacements&quot;</span><span class="p">)</span><span class="w"></span> </code></pre></div> <p>Vito&rsquo;s Bodytracker would <strong>publish events</strong> in the form of Tango events. The associated data would be a float tuple of dimensions <strong>2,1</strong> (2 columns, 1 row). Such a tuple, like (3.4, 1.2), would specify a relative linear and angular displacement of the astronaut. My Telerobotics module would <strong>subscribe to this Tango event</strong> and <em>transform</em> the data into a <strong>Twist</strong> message that the Husky can understand.</p> <ul> <li>Extension of camera streaming to a dual-camera setup. I am extending the streaming capability for a stereo camera.</li> </ul> <p>Mid-term evaluations start tomorrow! Eagerly looking forward to them. It has been an eventful and productive half summer of code. I hope the next half is even more exciting and challenging than the one that passed.</p> <p><em>Ciao</em></p>When two Distributed Systems meet!2015-06-18T19:53:52+00:002015-06-18T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-06-18:/letters/when-two-distributed-systems-meet/<p>Hi! This post is meant to be an insight into the experience and progress of the third and fourth weeks of my (a)vocation with the Google Summer of Code Program. Things got much pacier and smoother in the past two weeks. I&rsquo;ve been able to get a stable …</p><p>Hi! This post is meant to be an insight into the experience and progress of the third and fourth weeks of my (a)vocation with the Google Summer of Code Program.
Things got much pacier and smoother in the past two weeks. I&rsquo;ve been able to get a stable codebase up and running with respect to the aims discussed in the timeline.</p> <p><img alt="Sublime Text Workspace" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/workspace2.png"> &lt;the usual rant&gt; I had to totally restructure my programming workspace for the second time to support intelligent, IDE-like features, since the Python packages I am working with (ROS and Tango) have a fair number of modules whose documentation I need to read on the fly while coding away. Thus I set up both my <strong>Vim and Sublime Text</strong> environments to support <em>intelli-sense</em>, <em>code completion</em>, <em>block syntax completion</em>, etc. I also added a dual-monitor setup with the unused LCD television at my home to make for an efficient programming ambience. &lt;/the usual rant&gt;</p> <h2>Telerobotics Code Pushed</h2> <p>As I mentioned in my <a href="proxy.php?url=https://sidcode.github.io/blog/2015/04/29/gsoc-2015-with-the-italian-mars-society/">first post</a>, the contributors of the <strong>Italian Mars Society</strong> are given <em>write access</em> to the online Bitbucket repository. This is a tremendous responsibility; we must ensure that our updates don&rsquo;t disturb the stability of the project. To manage this, I follow the simple and effective advice of my mentors -</p> <div class="highlight"><pre><span></span><code>hg pull
hg update
hg add .
hg commit -m &quot;My awesome Commit Message&quot;
hg push
</code></pre></div> <p>This simple algorithm ensures that all students can work at their pace without breaking the system.
<a href="proxy.php?url=https://hginit.com/">This simple tutorial</a> can help the uninitiated understand what I just said.</p> <p>While working with Tango servers for my project, I had to constantly use the bundled GUI - <strong>Jive</strong> - which works as a one-stop solution for <a href="proxy.php?url=https://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/quicktour.html">Device Servers</a>. But my primordial hacker instincts prompted me to write a <a href="proxy.php?url=https://en.wikipedia.org/wiki/Command-line_interface">CLI</a> solution to add and remove device servers using the amazing <a href="proxy.php?url=https://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/#">PyTango API</a>. Thanks to Ezio&rsquo;s excellent comments on my commits, I&rsquo;ve been able to contribute a Pythonic solution for working with Device Servers in a jiffy. The script can be found <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/utility/setup-device.py?at=default">here</a>. It has a nice UI to help the user figure out what he/she needs to enter. I have yet to correct some formatting errors to make it more consistent with PEP 8 and the <a href="proxy.php?url=https://docs.python.org//glossary.html#term-eafp">EAFP</a> idiom. The current stage of argument validation is more like LBYL (Look Before You Leap), which is slow for the script&rsquo;s use-case.</p> <p><strong>The second module</strong> I pushed is the <strong>Husky Test</strong> script, which checks whether the Husky installation works on a particular setup. The <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/utility/test_husky.py?at=default">test script</a> allows a Husky to move with a particular linear and angular velocity.
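</p>

<p>The LBYL-versus-EAFP distinction mentioned above can be illustrated with a generic sketch (my own example, not the actual <code>setup-device.py</code> code) -</p>

```python
# Generic illustration of LBYL vs EAFP argument validation.
# This is NOT the project's setup-device.py; names are made up.

def parse_count_lbyl(value):
    """Look Before You Leap: check the input first, then convert."""
    if isinstance(value, str) and value.strip().isdigit():
        return int(value)
    return None

def parse_count_eafp(value):
    """Easier to Ask Forgiveness than Permission: just try it."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return None

# Both agree on well-formed input; EAFP needs no up-front type checks.
assert parse_count_lbyl("42") == parse_count_eafp("42") == 42
assert parse_count_lbyl(None) is None and parse_count_eafp(None) is None
```

<p>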
The <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/doc/sad.rst?at=default">Software Architecture Document</a> was also updated to account for the new changes in the ROS-Tango interface architecture. A better understanding of the SAD can be had from <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/29/software-architecture-document-for-telerobotics/">an earlier post</a>.</p> <h2>Docker</h2> <p>I explained the Docker setup and distribution in a <a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/12/all-for-docker-docker-for-all/">quick mini-post</a>. I verified that the X errors don&rsquo;t interfere with the scripts that I have been developing, since ROS topics can be accessed from the command line as well. This is a good thing. The Docker repository for my workspace can be found <a href="proxy.php?url=https://registry.hub.docker.com/u/sidcode/ros-eras/">here</a>.</p> <h2>Python Reading</h2> <p>I have been voraciously consulting the following sources for getting the hang of Python and PyTango programming -</p> <ul> <li>Python Docs for <a href="proxy.php?url=https://docs.python.org/2/">Python 2</a> and <a href="proxy.php?url=https://docs.python.org/3/">Python 3</a></li> <li><a href="proxy.php?url=https://shop.oreilly.com/product/0636920027072.do">Python Cookbook</a> by O&rsquo;Reilly Publishers</li> <li><a href="proxy.php?url=https://shop.oreilly.com/product/0636920032519.do">Fluent Python</a> (early access), again by O&rsquo;Reilly Publishers</li> <li><a href="proxy.php?url=https://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/index.html">PyTango documentation</a></li> </ul> <p>The happiest point of all this reading kicked in when I could help Vito reduce <strong>fifty lines of code to just two</strong> with the use of the <code>exec</code> construct in Python.
In case you&rsquo;re wondering, this is the <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/commits/2da8222593354228a1eb426bef556654e794365c#Lservers/body_tracker/tracker/tracker.pyT40">code written by Vito</a> -</p> <div class="highlight"><pre><span></span><code><span class="w"> </span><span class="nv">joints</span><span class="w"> </span><span class="o">=</span><span class="w"> </span>[<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_head&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_neck&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_shoulder&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_shoulder&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_elbow&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_elbow&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_hand&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_hand&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_torso&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_hip&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_hip&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_knee&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_knee&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_left_foot&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="s1">&#39;skeleton_right_foot&#39;</span><span class="w"></span> <span class="w"> </span>]<span class="w"></span> 
<span class="w"> </span><span class="nv">attr_init_params</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="nv">dict</span><span class="ss">(</span><span class="w"></span> <span class="w"> </span><span class="nv">dtype</span><span class="o">=</span><span class="ss">(</span><span class="s1">&#39;float32&#39;</span>,<span class="ss">)</span>,<span class="w"></span> <span class="w"> </span><span class="nv">unit</span><span class="o">=</span><span class="s1">&#39;m&#39;</span>,<span class="w"></span> <span class="w"> </span><span class="nv">max_dim_x</span><span class="o">=</span><span class="mi">3</span>,<span class="w"></span> <span class="w"> </span><span class="nv">polling_period</span><span class="o">=</span><span class="nv">POLLING</span><span class="w"></span> <span class="w"> </span><span class="ss">)</span><span class="w"></span> <span class="w"> </span><span class="k">for</span><span class="w"> </span><span class="nv">joint</span><span class="w"> </span><span class="nv">in</span><span class="w"> </span><span class="nv">joints</span>:<span class="w"></span> <span class="w"> </span><span class="k">exec</span><span class="w"> </span><span class="s2">&quot;%s = attribute(**attr_init_params)&quot;</span><span class="w"> </span><span class="o">%</span><span class="w"> </span><span class="nv">joint</span><span class="w"></span> </code></pre></div> <p>Note that without the <code>exec</code> usage, each attribute line would have to be written manually for every joint in the <code>joints</code> list.</p> <h2>Ongoing Stuff</h2> <p>There are certain deliverables in the pipeline currently waiting to be pushed to the online repository over the course of the next week.
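</p>

<p>An aside on the <code>exec</code> example above: a similar fifty-to-two reduction can also be sketched without dynamic code generation, by collecting the attributes in a dictionary. This is an illustrative alternative with a stand-in <code>attribute()</code> factory, not the project&rsquo;s actual PyTango code -</p>

```python
# Illustrative alternative to the exec approach above: build the
# per-joint attributes in a dict instead of creating module-level
# names. attribute() here is a stand-in for PyTango's factory.

def attribute(**params):          # stand-in, for illustration only
    return dict(params)

joints = ['skeleton_head', 'skeleton_neck', 'skeleton_torso']  # abridged

attr_init_params = dict(dtype=('float32',), unit='m',
                        max_dim_x=3, polling_period=100)

attrs = {joint: attribute(**attr_init_params) for joint in joints}
assert attrs['skeleton_head']['unit'] == 'm'
```

<p>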
I have been working on -</p> <ul> <li>ROS-feedback Aggregator Device Server for Tango</li> <li>ROS Commander Node for the Husky</li> <li>Tango Client to understand Husky status (battery levels, sensor monitor, etc.)</li> <li>Mathematical Transformations and Named Tuples for different structures that Telerobotics requires.</li> </ul> <p>GSoC with the PSF and the Italian Mars Society is turning out to be fun and challenging. Mid-term Evaluations start in a week. Lots of work to do. I strongly hope my next post will be a celebratory one highlighting the pushed code I described in <em>Ongoing Stuff</em>.</p> <p>Until then, Ciao!</p>All for Docker; Docker for all!2015-06-12T19:53:52+00:002015-06-12T19:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-06-12:/letters/all-for-docker-docker-for-all/<p>Hi! This is going to be a short post about my developments in Week 3 of my GSoC project. Since my <a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/08/tango-ing-with-ros-week-2/">last post</a>, I have had the chance to work with some exciting state-of-the-art technologies which allow easy distribution and scalability. These are -</p> <ol> <li>Docker <img alt="Docker Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/docker-logo.png"></li> <li>Tango-Controls <img alt="Tango Controls logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/tangologo.png"></li> </ol> <p>I used the <a href="proxy.php?url=https://registry.hub.docker.com/_/ubuntu/">Ubuntu …</a></p><p>Hi! This is going to be a short post about my developments in Week 3 of my GSoC project. Since my <a href="proxy.php?url=https://sidcode.github.io/blog/2015/06/08/tango-ing-with-ros-week-2/">last post</a>, I have had the chance to work with some exciting state-of-the-art technologies which allow easy distribution and scalability.
These are -</p> <ol> <li>Docker <img alt="Docker Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/docker-logo.png"></li> <li>Tango-Controls <img alt="Tango Controls logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/tangologo.png"></li> </ol> <p>I used the <a href="proxy.php?url=https://registry.hub.docker.com/_/ubuntu/">Ubuntu 14.04</a> <em>Docker Container</em> to set up my system, which can be used by anyone in the world as a common platform to test the applications that I am working on. This has multiple advantages -</p> <ul> <li>Setup time for collaborators is virtually nil. The developer sets up the Docker container and the community members can use it directly.</li> <li>Host platform-independence. It doesn&rsquo;t matter whether the collaborator&rsquo;s host system is Arch Linux, Windows 8, or a specific version of Ubuntu. Docker uses <a href="proxy.php?url=https://www.toptal.com/linux/separation-anxiety-isolating-your-system-with-linux-namespaces">Linux namespaces</a> and ensures a separation of concerns.</li> <li>Revision control mechanism. The developer plays around with a Docker image just as he/she would with any other <strong>Distributed Revision Control system</strong>. I <strong>push</strong> my changes to the repository (Docker image) and my mentors can simply <strong>pull the updates</strong> to get the new system configuration.</li> </ul> <p>So far, I have set up Tango-Controls, ROS Indigo, and the Husky libraries for my Docker image. These can be found on the <a href="proxy.php?url=https://registry.hub.docker.com/u/sidcode/ros-eras/">Docker Registry Hub</a>.</p> <p>The issues that I am currently facing are -</p> <ul> <li>Graphics problems: X-server Bad Drawing errors. A way to get around this will be to better understand how ROS applications use the X-server and then provide Docker the appropriate graphics capabilities.
But this does not interfere with the command-line applications of ROS and Tango which I have been working on.</li> <li>MySQL connection problems. The current workaround is to use the host OS&rsquo;s Tango HOST. I observed that it works fine that way.</li> </ul> <p>This is it for this post. I mainly discussed Docker, which was an important topic in the <strong>All-hands meeting on 8th June</strong>. I&rsquo;ll go into much more detail on Tango Controls in the upcoming blog posts and the biweekly reports.</p> <p>Ciao!</p>Tango-ing with ROS- Week 2!2015-06-08T00:53:52+00:002015-06-08T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-06-08:/letters/tango-ing-with-ros-week-2/<p>Hi! This one is about my <strong>second week of the Google Summer of Code 2015 program</strong>. It was a busy long <strong>week two</strong> with some crucial design decisions to be implemented and new things to learn. It was also a hectic week of reading how to write better Python code …</p><p>Hi! This one is about my <strong>second week of the Google Summer of Code 2015 program</strong>. It was a busy long <strong>week two</strong> with some crucial design decisions to be implemented and new things to learn. It was also a hectic week of reading how to write better Python code (<code>Fluent Python - O'Reilly Publishers</code>, maintaining Python 2 and Python 3 compatibility, etc.). After finalizing the architecture last week (shown below), it was time to work on implementing it -</p> <p><img alt="ROS and Tango" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/rostango.png"></p> <p>As is evident from the diagram, there are <strong>two distributed systems</strong> involved - both significantly complicated. These are -</p> <ul> <li>Tango Controls</li> <li>ROS (Robot Operating System)</li> </ul> <p>The challenge here is to create an <strong>event-triggered Tango Device</strong> which serves <strong>as both a client and a server</strong>.
This Tango device listens for new events on the Tango bus, and sends data to it when needed. In addition, it is also interfaced with ROS: the Tango events required by ROS are processed by the device and published to the appropriate <code>TangoROS</code> topic when required. It also subscribes to the <code>ROSTango</code> topic to listen for any incoming updates from the robot.</p> <p>Some use-cases for this are as follows -</p> <ul> <li>The Bodytracking server pushes the location/orientation data on the bus.</li> <li>The TangoROS Device subscribes to the events of the Bodytracking data on the Tango bus.</li> <li>When an event is triggered, the device processes the data into ROS-compatible messages (<code>location</code> and <code>orientation</code> are <strong>transformed</strong> into <code>linear velocity</code> and <code>angular velocity</code>)</li> <li>The <em>ROS Commander</em> node (which is subscribed to the <code>ROSTango</code> topic) receives these messages and relays them to the robot.</li> <li>The ROS Commander node continuously monitors the robot for different measurements (<strong>sensor readings, battery status, navigation feedback, etc.</strong>). The important signals are published to the <code>ROSTango bus</code>.</li> </ul> <p>This is my first time working with the powerful Tango-Controls system.
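</p>

<p>The transformation step in the use-cases above can be sketched in plain Python. This is a hypothetical illustration - the function name, fields, and the event interval are my assumptions, not the actual ERAS code -</p>

```python
# Hypothetical sketch: turn a (linear, angular) displacement event
# into the linear/angular velocities that a ROS Twist message carries.
# The event interval dt_s is an assumed parameter, not a decided value.

def displacement_to_velocity(linear_m, angular_rad, dt_s):
    """Divide displacements by the event interval dt_s (seconds)
    to obtain linear (m/s) and angular (rad/s) velocities."""
    if dt_s <= 0:
        raise ValueError("event interval must be positive")
    return linear_m / dt_s, angular_rad / dt_s

# A relative displacement of 3.4 m and 1.2 rad over a 2-second interval:
lin_vel, ang_vel = displacement_to_velocity(3.4, 1.2, 2.0)
# lin_vel == 1.7 (m/s), ang_vel == 0.6 (rad/s)
```

<p>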
It is used by -</p> <ol> <li>The Italian Mars Society</li> <li>The Square Kilometre Array (SKA) radio telescope network</li> <li>Synchrotrons and particle accelerators around Europe</li> </ol> <p>I&rsquo;ll discuss how I work with Tango and ROS in my next blog post.</p> <p>The Italian Mars Society had an all-hands Skype meeting on 8th June, 2015 where all the GSoC students and mentors discussed project status, software architecture document feedback, roadblocks, hardware needs, collaboration, field tests, etc.</p> <p>Things that were discussed and are to be done -</p> <ul> <li>Docker image for the ROS setup (<strong>very important</strong>)</li> <li>Battery status Tango server</li> <li>ROS Tango client</li> <li>ROS Tango server for certain use cases</li> <li>Tango events</li> <li>Timestamp-based transformation of parameters in time-series data</li> <li>Set up the Minoru 3D camera and the Oculus Rift device</li> </ul> <p>This is a week where I&rsquo;d like most of these things to fall into place. GSoC is turning out to be exciting and challenging! Till the next post. Over to week three.</p> <p>Ciao!</p>Programming a Mars rover - Week 1!2015-06-03T00:53:52+00:002015-06-03T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-06-03:/letters/programming-a-mars-rover-week-1/<p>Hi! This is the sixth post in my <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">GSoC &lsquo;15 blog series</a>.</p> <p>So the much-awaited coding period began on 25th May, 2015. After a refreshing <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding</a> experience, <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/26/workspace-setup-for-telerobotics/">setting up my workspace</a>, and <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/29/software-architecture-document-for-telerobotics/">creating a Software Architecture Document</a> - I was in a position to start coding.</p> <h2>Aims and Milestones …</h2><p>Hi!
This is the sixth post in my <a href="proxy.php?url=https://sidcode.github.io/category/gsoc.html">GSoC &lsquo;15 blog series</a>.</p> <p>So the much-awaited coding period began on 25th May, 2015. After a refreshing <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding</a> experience, <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/26/workspace-setup-for-telerobotics/">setting up my workspace</a>, and <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/29/software-architecture-document-for-telerobotics/">creating a Software Architecture Document</a> - I was in a position to start coding.</p> <h2>Aims and Milestones</h2> <p>This week, according to the <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/07/gsoc-15-about-my-project/">timeline</a>, my aims were -</p> <ul> <li>Creating the initial set of ROS nodes for the Husky model for linear and angular motion</li> <li>Zeroing in on the basic interface for mapping the Kinect bodytracking information and the Motivity interface (being concurrently developed by Vito) to teleoperation commands that the Husky can understand</li> <li>Figuring out a way to integrate ROS and Tango into ERAS</li> </ul> <p>So far it has been a good week and I am on schedule. I am able to manipulate the motion of the simulated Husky via external stimuli.</p> <h2>Architecture</h2> <p>Before I describe my programs, let me first describe the high-level architecture with the help of a simple diagram -</p> <p><img alt="Telerobotics Architecture" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/arch2.png"></p> <p>As is evident from the diagram, there are <strong>two distributed systems</strong> involved - both fairly complicated. These are -</p> <ul> <li>Tango Controls</li> <li>ROS (Robot Operating System)</li> </ul> <p>This was by far the <strong>biggest challenge</strong> of the project.
Interfacing data from one distributed system to the other while maintaining low latency and ensuring high performance.</p> <p>Another challenge was handling real-time streaming data. I banged my head against Python streams and message brokers like <a href="proxy.php?url=https://www.rabbitmq.com/">RabbitMQ</a> and <a href="proxy.php?url=https://zeromq.org/">ZeroMQ</a>. But as <strong>Albert Einstein</strong> said -</p> <blockquote> <p>“If you can&rsquo;t explain it to a six year old, you don&rsquo;t understand it yourself.” </p> </blockquote> <p>All this while, I was confused about transferring data over an <strong>additional</strong> inter-process communication structure between two distributed systems. Meh. Sounds complicated. It actually is. And that is why I chucked that idea out. After spending three full days on this, I arrived at a <strong>much simpler architecture</strong> -</p> <p><img alt="ROS and Tango" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/rostango.png"></p> <p><em>Voila!</em></p> <p>The good thing about this diagram is that it works at scale with as many ROS nodes as one may like to add for the rover (Husky), without compromising on the data coming from the Tango bus. The <strong>missing piece</strong> of the <em>two distributed systems</em> puzzle is solved by a Tango ROS Node. Now I have a plan to work on in the second week of coding.</p> <p>These requirements had to be reflected in the Software Architecture Document as well. To this end, I set up the excellent <a href="proxy.php?url=https://github.com/timonwong/OmniMarkupPreviewer">OmniMarkupPreviewer</a> for <em>Sublime Text</em> to preview the <strong>reStructuredText</strong> (<strong>.rst</strong>) documents that I created.</p> <h2>Tryst with ROS and Husky</h2> <p>I had never worked with an Unmanned Ground Vehicle before. I did use ROS for robotics experiments at my university lab but needed to quickly jog my memory about ROS programming with <strong>rospy</strong>.
The excellent <a href="proxy.php?url=https://wiki.ros.org/ROS/Tutorials">ROS wiki</a> and the book <strong>ROS By Example</strong> came to my rescue - </p> <p><img alt="ROS By Example" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/rbxlogo.png"></p> <p>It is a haven for robot hobbyists like me and I&rsquo;ll continue to refer to it for a long time to come.</p> <p>Alright, I started my week with ROS programming. My first job was to bring up the simulator and make sure that the Husky model responds to commands -</p> <p>Husky (and other ROS robots) describe movements in the form of <a href="proxy.php?url=https://docs.ros.org/api/geometry_msgs/html/msg/Twist.html">Twist</a> messages -</p> <div class="highlight"><pre><span></span><code>[(x, y, z), (a, b, c)]
where (x, y, z) is the linear velocity along the x, y, z axes,
and (a, b, c) is the angular velocity about the x, y, z axes.
</code></pre></div> <p>So to move in a circle, we issue [(5, 0, 0), (0, 0, 2)]. This would result in a linear speed of 5 in the x direction and an angular speed of 2 about the z axis, resulting in a circular motion.</p> <p>A simple way to see this working is to use this command -</p> <div class="highlight"><pre><span></span><code>rostopic pub /husky_velocity_controller/cmd_vel geometry_msgs/Twist -r 100 &#39;[0.5,0,0]&#39; &#39;[0,0,0]&#39;
</code></pre></div> <p>This publishes a Twist message from the terminal to the <strong>/husky_velocity_controller/cmd_vel</strong> <em>ROS topic</em>; the <a href="proxy.php?url=https://docs.ros.org/api/geometry_msgs/html/msg/Twist.html">Twist</a> denotes a linear motion of 0.5 m/s along the x direction.</p> <p>This is Husky in action -</p> <p><img alt="Husky in action" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/husky_in_action.png"></p> <p>To do the same using rospy, the procedure is simple -</p> <ul> <li>Import the required libraries (to support <em>rospy</em>, logging, and <em>Twist</em> messages)</li> </ul> <div
class="highlight"><pre><span></span><code><span class="kn">import</span> <span class="nn">roslib</span>
<span class="kn">import</span> <span class="nn">rospy</span>
<span class="kn">from</span> <span class="nn">geometry_msgs.msg</span> <span class="kn">import</span> <span class="n">Twist</span>
</code></pre></div> <ul> <li>Set up a ROS node - in this case <strong>move</strong></li> </ul> <div class="highlight"><pre><span></span><code>rospy.init_node(&#39;move&#39;)
</code></pre></div> <p>ROS nodes act as identifiers (source and destination of messages) in the ROS distributed system (modeled as a graph).</p> <p>For instance, this is the ROS graph while the Husky is moving about -</p> <p><img alt="ROS Graph" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/rosgraph.png"> This is why ROS scales so well. Any number of publisher and subscriber nodes can be added to extend different applications.</p> <ul> <li>Set up a publisher to the appropriate ROS topic with the ROS message type</li> </ul> <div class="highlight"><pre><span></span><code>p = rospy.Publisher(&#39;husky_velocity_controller/cmd_vel&#39;, Twist, queue_size = 100)
</code></pre></div> <p>The <code>queue_size</code> argument specifies the message buffer length, and allows for asynchronous transfer of messages on the ROS message queue.</p> <ul> <li>Construct a Twist message</li> </ul> <div class="highlight"><pre><span></span><code>twist = Twist()
twist.linear.x = 0.5; twist.linear.y = 0; twist.linear.z = 0;
twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = 0;
</code></pre></div> <ul> <li>Publish the message</li> </ul> <div class="highlight"><pre><span></span><code>p.publish(twist)
</code></pre></div> <p>That was easy, wasn&rsquo;t it?</p> <p>Changing the attributes can allow the Husky to move in a circle or a nautilus shape -</p> <p><img alt="Husky Circle" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/husky_circle.png"></p> <p>In this way, I proceeded to create ROS
nodes to accept Twist messages from any application and made a small teleoperation program along the lines of the <strong>Arrow</strong> server in ERAS. With the help of Franco, I set up the Arrow Tango server and obtained the attributes for distance and orientation.</p> <p>The next aim is to use the distance and orientation information on the Tango bus and map it to Husky commands so that it may move around appropriately on the ground like this -</p> <p><img alt="Husky Nautilus" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/husky_nautilus.png"></p> <p><em>Random ROS Tidbit</em> - While working with ROS, I came across the interesting command <code>source</code>.</p> <p>Why do I call it interesting?</p> <p>It does not have a <strong>man-page</strong>, and it does not have a <strong>--help</strong> or <strong>-h</strong> argument. It has one simple purpose -</p> <blockquote> <p>Execute the content of the file passed as argument <strong>in the current shell</strong></p> </blockquote> <p>Note that it is not the same as <strong>./</strong>, which creates a new shell to run the command. Shells are nifty processes which allow other program processes to run. I wrote a shell from scratch for a Network Programming course assignment. You may find it <a href="proxy.php?url=https://github.com/sidcode/sigshell">here</a>.</p> <h2>Skype Meeting for Bodytracking</h2> <p>Franco, Yuval, Fabio, Ezio, Vito and I had an important meeting on 2nd June (a couple of hours before writing this post). The purpose of the meeting was <strong>Mapping Bodytracking with Telerobotics</strong>. The whole point of the project is to allow complete virtual and augmented reality immersion of the astronaut and the rover. This is what it means: the robot (a humanoid or a rover) should be able to mimic human action as much as possible. How?
If the astronaut runs fast on the Motivity treadmill at a particular angle, the robot should move faster at that angle relative to the moving base position. This would make use of Vito&rsquo;s Kinect-based bodytracking module for determining incremental distance and orientation.</p> <p>Since the Husky understands velocity via the Twist message, the distance/orientation information must be transformed into linear/angular velocity. I&rsquo;ll be working on that this week. </p> <p>Fabio brought up the important aspect of autonomy control in the robotic system. He stressed the need for three different stimuli to the robot -</p> <ul> <li>From the <strong>Bodytracking application</strong> (external)</li> <li>From the <strong>robot&rsquo;s onboard sensors</strong> (internal)</li> <li>From an external source</li> </ul> <p>This suggestion definitely adds robustness to the entire design; it will help the robot avoid hitting a rock and override an astronaut&rsquo;s command in case of danger. I will look into it this week and keep semi-autonomy in Telerobotics in mind.</p> <p>Yuval talked about contacting the team in Canada that facilitated the Husky during V-ERAS 14. The work that I do will eventually be tested on a real Husky. </p> <p>Adding a UR10 robotic arm to the Husky to facilitate manipulation and imitation of the human hand was also proposed. I&rsquo;ll look into that after the work on steering is complete.</p> <p>All in all, the meeting was <strong>quite important</strong>, and a bunch of <strong>crucial decisions</strong> regarding <strong>Telerobotics and Bodytracking</strong> were taken.</p> <h2>The Week ahead</h2> <p>The following week, we&rsquo;ll have another meeting with all the students and possibly a joint code review session.
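</p> <p>Part of that work is the distance/orientation-to-velocity transformation mentioned above. As a minimal sketch - my own illustration with assumed names and inputs, not the actual ERAS code - dividing the incremental quantities by the time step yields the velocities a Twist message expects:</p>

```python
# Hypothetical sketch: convert the incremental distance and heading
# change reported by bodytracking into the linear/angular velocities
# that a geometry_msgs/Twist message expects. The function name and
# parameters are illustrative assumptions, not the ERAS API.

def to_velocities(delta_distance, delta_heading, dt):
    """Return (linear, angular) velocity for a time step of dt seconds."""
    if dt <= 0:
        raise ValueError("time step must be positive")
    linear = delta_distance / dt    # m/s along the robot's x-axis
    angular = delta_heading / dt    # rad/s about the robot's z-axis
    return linear, angular

# Example: 0.25 m forward and 0.1 rad of turn over a 0.5 s step
linear, angular = to_velocities(0.25, 0.1, 0.5)  # (0.5, 0.2)
```

<p>The resulting values would then be assigned to <code>twist.linear.x</code> and <code>twist.angular.z</code> before publishing.</p> <p>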
I will be integrating ROS and Tango and adding support for different levels of robot control through additional ROS nodes.</p> <h2>Summary</h2> <p>In hindsight, I was scared my GSoC coding experience would turn out to be like this before the start of the <a href="proxy.php?url=">Coding period</a> -</p> <p><img alt="Coding By the Sill" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/codingbythesill.jpg"> Source - <a href="proxy.php?url=https://www.facebook.com/cluecomics">CLUE</a></p> <p>:) In fact I faced nothing like that (though the headphones and the loneliness are true :D ). There were minor setbacks. I had to reinstall ROS as a result of purging my MySQL configuration for Tango. Obviously, these were the usual frustrations that crop up with computer programming and Linux, but nothing humongous. This is where the <strong>Zen of Python</strong> kicks in! Using top-notch resources like the <code>logging</code> module, <code>rqt_graph</code>, and the inbuilt ROS logger, programming was a breeze. Add to that the awesomeness of the Italian Mars Society. I hit a snag with bodytracking, and six people got on a Skype call to resolve the issues, and resolve them we did, with gusto.</p> <p>The first week was super-hectic. Left alone with a computer and a programming problem, all-nighters were inevitable. It is proving to be a challenging and fun summer. </p> <p>Watch out for my next post in the GSoC 2015 series!</p> <p>Ciao!</p>Software Architecture Document for Telerobotics2015-05-29T00:53:52+00:002015-05-29T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-05-29:/letters/software-architecture-document-for-telerobotics/<h2>The First Three Days</h2> <p>Hi! The last couple of days have been quite hectic. I am still getting used to the <em>7-hours-a-day</em> schedule of GSoC.
But the good thing about GSoC is you can adjust the programming schedule according to your own convenience which is one more reason which takes …</p><h2>The First Three Days</h2> <p>Hi! The last couple of days have been quite hectic. I am still getting used to the <em>7-hours-a-day</em> schedule of GSoC. But the good thing about GSoC is that you can adjust the programming schedule to your own convenience, which is one more reason it is a notch above other summer coding programs. Still, I am devoting about 10 hours every day in these initial days to ensure I am well-positioned with respect to my timeline, and to keep learning on the go.</p> <p>Rants aside, I recently completed the first draft of the Software Architecture Document for my project.</p> <h2>Software Architecture Document</h2> <p>A <strong>Software Architecture Document</strong> (quite funnily abbreviated as SAD) is a very important piece of documentation that spells out what a software project is going to look like when it is built and shipped. I must thank the Italian Mars Society for giving me the much-needed push into the world of Open Source Software Engineering.</p> <p>As the excellent book <a href="proxy.php?url=https://aosabook.org/en/index.html">Architecture of Open Source Applications</a> describes, the architecture of any program, especially an open-source one, describes the software in terms of <strong>different</strong> layers of abstraction, depending on who wants to look at and improve upon it. Open-source applications are a product of the efforts of multiple people working on different aspects of a project together.
To facilitate effective and non-redundant collaboration with proper version control, a software architecture document comes in handy.</p> <p>To put it in one line - </p> <blockquote> <p><strong>A SAD ensures all developers, testers, and users are on the same page.</strong></p> </blockquote> <h2>SAD for Telerobotics Application</h2> <p>Take my <strong>Telerobotics application</strong> for instance. It is made up of three <em>distinct</em> <strong>features</strong> or <strong>functional requirements</strong> - </p> <ul> <li>Mapping human body-tracking information to rover motion instructions</li> <li>Allowing real-time streaming of the rover&rsquo;s stereo camera feed to the ERAS application</li> <li>Providing an Augmented Reality interface obtained from processing the rover&rsquo;s sensor data</li> </ul> <p>Although I am currently the only developer working on these aspects, I must ensure that the application remains in a <strong>well-maintained state</strong> throughout the life of the project. I must also ensure that a developer with skills in Robotics gets the information relevant to the Robotics subsystem of the application (ROS knowledge). I must separate the concerns of a Network Communications developer from those of the user (the astronaut) while working on real-time streaming from the rover to the head-mounted Virtual Reality device.</p> <p>While the features describe the expected behaviour of the software system, they require a lot of background machinery which is essential for operation but not relevant to expose to the end-user. These are <strong>non-functional requirements</strong>. To give an example, the <strong>Robot Operating System</strong> is used to maneuver the Husky robot around. But the astronaut and the software system need not be concerned that robot communication, control, and command (the C3 architecture) take place using ROS rather than other robot platforms like YARP or Player/Stage.
</p> <p>Non-functional requirements are in turn quite important for satisfying the performance requirements of the software system. For instance, the Real-Time Streaming Protocol (RTSP) that I&rsquo;ll be working with soon directly impacts the performance requirement of <strong>hard real-time streaming support</strong>.</p> <p>The Software Architecture Document is generic in that it keeps in mind the evolving technology that may be used to serve the application in focus. For instance, the <strong>Unmanned Ground Vehicle</strong> currently being considered is the <strong>Husky rover</strong>. It is my responsibility to ensure that the logical layers are independent of the robot being used. The software should be easily <strong>extensible</strong> to a future ground vehicle that may use an altogether different control architecture than <strong>ROS</strong>.</p> <p>Finally, a SAD is practical. It describes the timeline of development of the features.</p> <h2>My experience with SADs</h2> <p>Working on the SAD has been an immensely edifying experience for me, for several reasons -</p> <ol> <li>My first foray into Software Engineering literature.</li> <li>Learning <em>reStructuredText</em> as the documentation tool for the SAD.</li> <li>Appreciating how finely ingrained software-engineering principles are in Programming Language design.
For instance, the sections of a SAD directly reflect the features of Object-Oriented Programming (abstraction, encapsulation, separation of concerns) and Functional Programming (side effects, higher-order functions).</li> </ol> <h2>Links to the document</h2> <p>If you are interested, <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/src/132fff239c3ff892f7cfc8836d3a2921244e444e/servers/telerobotics/doc/?at=default">here is the source of my software architecture document</a>.</p> <p>The documentation on readthedocs can be found <a href="proxy.php?url=eras.readthedocs.org/en/latest/servers/telerobotics/doc/sad.html">here</a>.</p> <p>Until my next post, on my first week of coding.</p> <p>Ciao!</p>Workspace Setup for Telerobotics2015-05-26T00:53:52+00:002015-05-26T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-05-26:/letters/workspace-setup-for-telerobotics/<p>Hi! Yesterday was the start of the <strong>coding period</strong>, which will continue for another 12 weeks. The <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding period</a> gave me enough time to install the required packages.
This post explains those packages in minimal detail.</p> <h2>Project Components</h2> <p>My work will heavily require the use of -</p> <ol> <li><strong>ROS (Robot Operating …</strong></li></ol><p>Hi! Yesterday was the start of the <strong>coding period</strong>, which will continue for another 12 weeks. The <a href="proxy.php?url=https://sidcode.github.io/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding period</a> gave me enough time to install the required packages. This post explains those packages in minimal detail.</p> <h2>Project Components</h2> <p>My work will heavily require the use of -</p> <ol> <li><strong>ROS (Robot Operating System)</strong> to work with the <a href="proxy.php?url=www.clearpathrobotics.com/husky/">Husky Rover</a></li> </ol> <p><img alt="ROS Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/roslogo.png"></p> <p>ROS is a meta-operating system that is very popular with roboticists. My future posts will describe my work with ROS, and the concepts that I am using, in detail.</p> <p>More specifically, I am working with ROS Indigo Igloo, which is an LTS (long-term support) release.</p> <p><img alt="Indigo Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/indigologo.png"></p> <ol start="2"> <li><strong>Gazebo Simulation environment</strong> to test the programs written to drive the Husky around</li> </ol> <p><img alt="Gazebo Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/gazebologo.png"></p> <p>I am working with Gazebo version 2.2.3.</p> <ol start="3"> <li><strong>Tango-Controls</strong> Supervisory Control and Data Acquisition system</li> </ol> <p>If data from different devices is the blood of ERAS, then Tango is the circulatory system.
It does an excellent job of handling multiple devices (the Motivity treadmill, Kinect sensors, Blender Game Engine instances, and in my case a ROS machine with Husky interfaces).</p> <p><img alt="Tango Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/tangologo.png"></p> <ol start="4"> <li><strong>Blender Game Engine</strong> to model the standalone V-ERAS application.</li> </ol> <p><img alt="Blender Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/blenderlogo.png"></p> <p>The V-ERAS simulation of the spacecraft looks like this -</p> <p><img alt="V-ERAS simulation" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/verassim.png"></p> <p>In the second phase of the project, I will be involved in real-time streaming of the rover&rsquo;s stereo camera feed to the displays in the V-ERAS simulation.</p> <ol start="5"> <li><strong>Python</strong> (of course :D )</li> </ol> <p><img alt="Python Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/python-logo.png"></p> <ol start="6"> <li><strong>Ubuntu 14.04 (Trusty Tahr)</strong></li> </ol> <p>ROS Indigo offers complete support for this version of Ubuntu.</p> <p><img alt="Ubuntu Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/ubuntulogo.png"></p> <h2>Screenshots</h2> <p>To Python-ify my experience even further, I installed <strong>Terminator</strong>, a Python-based program which makes terminal arrangement as flexible as humanly possible on Linux.</p> <p>Working with ROS requires opening up a lot of terminals, and Terminator makes this hassle-free.</p> <p>Take a look for yourselves -</p> <p><img alt="Terminator" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/terminator.png"></p> <p>I am using different text editors for different purposes.</p> <p>While working with <strong>Markdown</strong> and <strong>reStructuredText</strong>, I use Sublime Text.</p> <p><img alt="Sublime Text Logo" 
src="proxy.php?url=https://sidcode.github.io/images/articles/2015/sublimelogo.png"></p> <p>Vim is my editor of choice for all things Python. I have been using it for open-source development since last year.</p> <p>So, with this I wrap up this setup post.</p> <p>Just for kicks, this is what my desktop looks like -</p> <p><img alt="Desktop IMS" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/desktop.png"></p> <p>I must admit it keeps me motivated to design software for Mars missions. Just in case you&rsquo;re wondering, the theme I use is the MacBuntu theme. It is pretty distraction-free.</p> <h2>To Coding and beyond!</h2>Room 276, Gandhi Bhawan, BITS Pilani2015-05-24T00:53:52+05:302015-05-24T00:53:52+05:30Siddhant Shrivastavatag:sidcode.github.io,2015-05-24:/letters/room-276-gandhi-bhawan-bits-pilani/<h2>Why write a post about an indistinguishable dorm room?</h2> <p><strong>Gandhi 276</strong> (Coordinates - 28.360874 N, 75.588507 E) ushers in a profuse stream of consciousness (and subconsciousness) for me.</p> <p>I made it my home for <strong>two</strong> long academic years (that is, four semesters (2013-2015) and a summer (2015)).</p> <p>Today (24 …</p><h2>Why write a post about an indistinguishable dorm room?</h2> <p><strong>Gandhi 276</strong> (Coordinates - 28.360874 N, 75.588507 E) ushers in a profuse stream of consciousness (and subconsciousness) for me.</p> <p>I made it my home for <strong>two</strong> long academic years (that is, four semesters (2013-2015) and a summer (2015)).</p> <p>Today (24 May, 2015), I am leaving it as I complete the final phase of packing up the lightweight items. Incidentally, today is also the date on which I took my BITSAT exam in 2012.
It also happily coincides with the end of the community bonding period and the start of the coding period for GSoC, which I explained in <a href="proxy.php?url=https://sidcode.github.io/letters/gsoc-15-community-bonding/">this post</a>.</p> <h2>How I got this room</h2> <p>Out of pure whim. BITS Pilani allows students to choose their wings and rooms. The final wings are decided by a lottery system in case of a collision. So yes, our wing (the <em>&lsquo;ghot&rsquo;</em> wing) got the upper back wing, and consequently I got this room.</p> <blockquote> <p>About Gandhi 276</p> </blockquote> <h2>Experiences with GN-276</h2> <p>I set foot in this room on <strong>July 31, 2013</strong>. The previous room occupant, <a href="proxy.php?url=https://in.linkedin.com/pub/lohi-uppalapati/2a/934/27a">U.R. Lohi</a>, was the president of the BITS Pilani Student Union for the 2012-2013 session. So in a way, I got the president&rsquo;s room, out of pure whim. <em>A cool and cringey bragging right to start with.</em></p> <blockquote> <p><strong>First Semester, 2013</strong></p> </blockquote> <ul> <li>Best time of my CS program</li> <li>Gelled in with the new wingmates (called <em>wingies</em> at BITS). Sidenote: the wing was called the &ldquo;ghot&rdquo; wing.</li> <li>The distraction-free semester, even with the new laptop</li> <li>Programmed extensively in Java, Prolog, and C</li> <li>The last semester with an advanced Mathematics course, which I enjoyed (<strong>Differential Equations</strong>)</li> <li>Worked hard as a <strong>Technical Team</strong> member of <a href="proxy.php?url=https://embryo.bits-pilani.ac.in"><strong>BITSEmbryo</strong></a></li> <li>Still a Windows user</li> <li>The start of my Robotics career</li> </ul> <blockquote> <p><strong>Second Semester, 2014</strong></p> </blockquote> <ul> <li>The lowest point of my CS program</li> <li>Watched more than 500 films</li> <li>Avid Linux user (full-time Archer)</li> <li>Active <strong>BITS Firefox Community</strong> member. 
Spent a lot of time motivating people to use Free Software</li> </ul> <blockquote> <p><strong>First Semester, 2014</strong></p> </blockquote> <ul> <li>The room gets a revamp.</li> <li>The pleasant <strong>research labs</strong> phase of my undergrad life. I also made some friends for life during this time.</li> <li>The start of my Wearable and Pervasive Computing career</li> <li>Extensive use of Arduino and Raspberry Pi</li> <li>Positive change in attitude, thanks to new friends</li> <li>Network Programming phase of life</li> <li>My first Ping of Death, to <a href="proxy.php?url=https://www.quora.com/Shardul-Deshpande">Shardul</a> (on the same wing subnet)</li> <li>The start of internship applications</li> <li>Embracing rejections</li> </ul> <blockquote> <p><strong>Second Semester, 2015</strong> </p> </blockquote> <ul> <li>The most stressful period of my life</li> <li>Embracing even more rejections</li> <li><strong>Compiler Design</strong> semester</li> <li>Selection in the <a href="proxy.php?url=https://india.media.mit.edu/">MIT Media Lab Design Innovation Workshop 2015</a></li> <li>Made some awesome friends from all over India (and the world)</li> <li>Lost some awesome friends for stupid reasons</li> <li>Working for the <a href="proxy.php?url=https://www.google-melange.com/gsoc/homepage/google/gsoc2015">Google Summer of Code</a> as a last respite</li> <li>Project proposal getting selected for GSoC 2015</li> <li>Juggling Compilers, Robotics work, a Computer Networks lab Teaching Assistantship, GSoC work, internship applications, Typeracing, and films</li> <li>The semester when I was out of spacetime</li> </ul> <blockquote> <p><strong>Summer 2015</strong> </p> </blockquote> <ul> <li>A hotchpotch of discipline and dust</li> <li>Dusty times with the wing being cleaned out</li> <li>Improved circadian rhythms and exercise</li> <li>The &ldquo;Getting shit done&rdquo; phase</li> </ul> <blockquote> <p>The common denominator in all these eventful times - <strong>Gandhi 
276</strong>.</p> </blockquote> <h2>Thank You, Gandhi 276.</h2> <p>For standing by me. For becoming a home that I&rsquo;d go on to endear more than my family home. For sheltering me from the sweltering heat and the frosty chills of this sleepy university town. I learnt to touch-type as fast as 100 words per minute in this room. My first Git commit. My first pull request. My first blog post. Thank you, room, for all the experiences, sicknesses, treatments, soporific ambiences, all-nighters, Gtalk &amp; IRC sessions, and above all - a nurturing environment where I grew up as a Computer Scientist.</p> <h2>Closing remarks</h2> <p>This has been a long farewell post. And this experience wouldn&rsquo;t have been this awesome if not for the people around this room - Karan, Kunal, Gaurav, Sai Charan, Shreyansh, Priyank, and Girish. I wish all the best in life and beyond to the future and past occupants of this room.</p> <blockquote> <p>&ldquo;May the GN276 Force be with you!&rdquo;</p> <p>So long and thanks for all the fish.</p> </blockquote>GSoC '15 Community Bonding2015-05-23T00:53:52+00:002015-05-23T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-05-23:/letters/gsoc-15-community-bonding/<p>Third post in the GSoC 2015 series. Here I&rsquo;ll take you through the engaging community bonding experience.</p> <h2>Introduction to Community Bonding</h2> <p><strong>Community Bonding</strong> is arguably one of the most important phases of the Google Summer of Code. In the 2015 edition, it took place from April 27 to May 25. 
This is what the <a href="proxy.php?url=https://www.google-melange.com/gsoc/document/show/gsoc_program/google/gsoc2015/help_page">GSoC FAQ</a> has to say about this period -</p> <blockquote> <p>Students get to know mentors, read documentation, get up to speed to begin working on their projects.</p> </blockquote> <h2>About the community</h2> <p><img alt="ERAS logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/eras-logo.jpg"></p> <p>The <strong>Italian Mars Society</strong> is a highly motivated group of incredibly smart and friendly scientists and developers who share the vision of working towards manned missions to Mars. I have been interacting with the community since March 2015 and I&rsquo;ve never looked back. I was interested in the projects even during the brief period when it was unclear whether IMS would be able to participate or not. I&rsquo;m grateful to the community members for applying under the Python Software Foundation umbrella and giving students like me a brilliant opportunity to explore real-world Open Source development. From what I&rsquo;ve heard, this organization comes up with the <em>coolest</em> projects for GSoC. And I concur - my project seems to blend all the cool fields required for exploration: Robotics, Body-tracking, Virtual Reality, Oculus Rift, real-time 3-D video streaming, Augmented Reality, etc.</p> <h2>Understanding the Codebase</h2> <p><img alt="Mercurial logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/mercurial.png"></p> <p>To this end, there is a substantial amount of software/hardware development shared on the Bitbucket platform. 
While interacting with Franco and Ezio, I discovered that all students are given <strong>write access</strong> to the <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/eras/">ERAS</a> and <a href="proxy.php?url=https://bitbucket.org/italianmarssociety/v-eras-blender">V-ERAS</a> repositories using the <strong>Mercurial</strong> revision control system. This places tremendous responsibility on us as new developers, which I very much appreciate since it fosters trust and makes us mature community members. </p> <p>Going through the codebase a couple of weeks ago, I found well-documented code, almost all of which follows the PEP8 guidelines and is written in Python 3. The heart of the V-ERAS project is the <a href="proxy.php?url=https://www.tango-controls.org/">Tango Controls</a> server, a distributed device server for Supervisory Control and Data Acquisition systems. This is ideal for a complex environment like ERAS, where multiple hardware and software devices like the Oculus VR, Kinect, Linux machines, the Husky rover, and Blender Game Engine applications are involved in a distributed setup. The entire networking subsystem of ERAS is well explained in <a href="proxy.php?url=https://erasproject.org/download/the-networking-sub-system-of-t-he-virtual-european-mar-s-analog-station-e-melotti-bachelors-thesis/">Ezio&rsquo;s thesis</a>.</p> <h2>Interacting with the Community</h2> <p>My experience with the Italian Mars Society has been memorable and pleasant right from the word go, when I first entered the hallowed <strong>IMS</strong> channel on IRC (Internet Relay Chat) and introduced myself. I was promptly pointed to the right person for my project of interest. Within a single IRC chat session with Franco, I got a clear idea of what to expect from this GSoC. The IRC channel, though frequented by a small number of people, is always bustling with activity. 
We&rsquo;ve had fruitful discussions about each and every part of the project - from software architecture diagrams in the proposal, to the collaboration between two GSoC projects, and even some fun interactions about Python software development and Mars exploration. I always appreciate the levels of responsibility and feedback that the community members muster while interacting with students. Helping my fellow GSoC aspirants and seeking help from them is always a refreshing experience. Apart from IRC and email, I got the chance to <strong>video-conference</strong> with all the project mentors on two occasions - <em>during my GSoC interview and in the kickoff meeting after the GSoC selection</em>. This was the first time I had a teleconference interview, and I thank IMS for that. It felt more like a sincere discussion of the things that I had in mind for the GSoC project than a test of my skills. The trust these guys had in me let me confidently speak my mind, which helped me make my points. The big GSoC Kickoff meeting took place on April 29, 2015, where we all gathered on <strong>Google Hangouts</strong> to discuss various important points for the upcoming summer of code, such as -</p> <ol> <li>The importance of blogging</li> <li>Hardware/software requirements</li> <li>Strategic timeline of events</li> <li>Software engineering guidelines</li> <li>Suggestions of joint code review sessions</li> </ol> <p>I helped prepare the meeting minutes for this session, since some members faced connection problems joining the Hangout. These are shared in <a href="proxy.php?url=https://docs.google.com/document/d/1jRhBnmjlMINCjwuomrE18BPTKnUO8974-I9qexb5TfQ/edit?usp=sharing">this document</a>.</p> <h2>Setup and Technologies</h2> <p><img alt="Husky Rover" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/husky1.jpe"></p> <p>I have been exposed to an ample number of new concepts and technologies through this project. 
</p> <ul> <li><strong>Terrain Vehicle Rover</strong> - Clearpath Robotics&rsquo; <strong>Husky robot</strong>, which is ROS-based</li> <li><strong>Microsoft Kinect Sensor</strong> for obtaining body-tracking information</li> <li><strong>Minoru 3D webcam</strong> for stereo video streaming</li> <li><strong>Oculus Rift Development Kit 2</strong> for augmented reality applications</li> </ul> <p>To this end, I set up my workstation for the <a href="proxy.php?url=https://docs.google.com/document/d/11iE-pQ8wEX8BUwbexGgULJddv0xWRN98MYRrd0iunOI/edit?usp=sharing">project requirements</a>.</p> <p>My current machine configuration for this GSoC project is as follows:</p> <ul> <li>Ubuntu 14.04.2 (Trusty Tahr)</li> <li>ROS Indigo</li> <li>Python 3</li> <li>Blender 2.74</li> <li>Tango Controls 1.99</li> <li>Linux Kernel 3.2</li> <li>Mercurial 3.4</li> <li>Hardware: 8 GB RAM, Intel Core i7 processor, Nvidia GT650M GPU with 2 GB memory</li> </ul> <p>In the last couple of weeks, I have been busy setting up the various ROS packages required for <em>bodytracking-based semi-autonomous teleoperation</em>. The list of ROS packages will be added to the project documentation soon.</p> <h2>Learning Experience so far</h2> <p><img alt="ffmpeg" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/ffmpeg.png"></p> <p>I&rsquo;ve learnt an unexpectedly great deal about a lot of different things during this project. I had to do a lot of reading to get up to speed with the existing state of V-ERAS. Franco pointed me to the <a href="proxy.php?url=https://eras.readthedocs.org/en/latest/index.html">project documentation pages</a>. I learned about Blender and the Blender Game Engine after pulling an all-nighter. FFmpeg followed soon after, when I had to set up an MJPEG streaming server for the BGE client. That was followed by my first experience with PEP8, Mercurial, architecture diagrams, and the Tango Controls system. My GSoC proposal has been an extensive piece of work, with 61 revisions and brilliant feedback from my mentors. 
The proposal can be found <a href="proxy.php?url=https://erasproject.org/2015-gsoc/#2">here</a>. A more comprehensive description of the project is taken up in <a href="proxy.php?url=gsoc-02-project-details.md">this post</a>.</p> <p><em>To be continued&hellip;about ROS, Software Testing, Mapping, algorithms, etc</em></p>Cracking the Google Summer of Code2015-05-16T00:53:52+00:002015-05-16T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-05-16:/letters/cracking-the-google-summer-of-code/<p>Hopped into the IRC chat. Showed grit. Prepared prototypes, drafted documents. Wonderful things happened.</p>GSoC '15 - About my Project2015-05-07T00:53:52+00:002015-05-07T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-05-07:/letters/gsoc-15-about-my-project/<p>Second Post in the GSoC 2015 series. This post is intended to explain my project proposal.</p> <p>The project proposal that I submitted can be found <a href="proxy.php?url=https://erasproject.org/2015-gsoc/#2">here</a>.</p> <p><img alt="ERAS Station" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/eras-station.jpg"> <em>to be continued&hellip;</em></p>GSoC 2015 with the Italian Mars Society2015-04-29T00:53:52+00:002015-04-29T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-04-29:/letters/gsoc-2015-with-the-italian-mars-society/<p><img alt="GSoC Banner" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/gsoc-banner.png"></p> <p>I got accepted into the eleventh edition of the <strong>Google Summer of Code</strong> program (<a href="proxy.php?url=https://www.google-melange.com/gsoc/homepage/google/gsoc2015">GSoC 2015</a>) with the <strong>Python Software Foundation</strong> umbrella organization. 
The list of selected students was announced on 28th April, 2015.</p> <p><img alt="Python Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/python-logo.png"></p> <p>More specifically, I&rsquo;ll be working with the Italian Mars Society under the ERAS (European MaRs …</p><p><img alt="GSoC Banner" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/gsoc-banner.png"></p> <p>I got accepted into the eleventh edition of the <strong>Google Summer of Code</strong> program (<a href="proxy.php?url=https://www.google-melange.com/gsoc/homepage/google/gsoc2015">GSoC 2015</a>) with the <strong>Python Software Foundation</strong> umbrella organization. The list of selected students was announced on 28th April, 2015.</p> <p><img alt="Python Logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/python-logo.png"></p> <p>More specifically, I&rsquo;ll be working with the Italian Mars Society under the ERAS (European MaRs Analogue Station) project. Quoting from the <a href="proxy.php?url=https://erasproject.org/">source</a> -</p> <blockquote> <p>The European MaRs Analogue Station for Advanced Technologies Integration (ERAS) is a program spearheaded by the Italian Mars Society (IMS) which main goal is to provide an effective test bed for field operation studies in preparation for manned missions to Mars.</p> </blockquote> <p><img alt="ERAS logo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/eras-logo.jpg"></p> <p>The focus of this GSoC project is <strong>Virtual Reality based Telerobotics</strong> for V-ERAS.</p> <p><strong>Virtual European Mars Analog Station (V-ERAS)</strong> is based on immersive real-time environment simulations running on top of the Blender Game Engine (BGE).</p> <p>This project has three distinct components - </p> <ol> <li> <p>A <strong>ROS-Kinect</strong> interface for the Teleoperative control of the Clearpath Husky Robot rover&rsquo;s motion via human body-tracking.</p> </li> <li> 
<p><strong>Streaming the 3-D stereo camera video feed</strong> from the rover to BGE over the network.</p> </li> <li> <p>Processing the video feed into an <strong>Augmented Reality</strong> experience through a <strong>head-mounted Virtual Reality device</strong>. </p> </li> </ol> <p>The goal of this V-ERAS project is thus to develop a software and hardware system that enhances the capabilities of the crew members preparing for Mars missions.</p> <p>I feel elated to be a part of the Italian Mars Society and be able to contribute towards manned space exploration which is one of the vital aims of the next two decades. GSoC marks my first foray into the world of collaborative Open Source software development.</p> <p>I shall be mentored by two really cool people - <a href="proxy.php?url=https://il.linkedin.com/in/yuvalbrodsky">Yuval Brodsky</a> and <a href="proxy.php?url=https://plus.google.com/105053384149339279492/posts">Fabio Nigi</a> with whom I share my interests in space exploration, robotics, networks, and free software. In addition, I&rsquo;ll be constantly interacting with the IMS-ERAS community - <a href="proxy.php?url=in.linkedin.com/pub/franco-carbognani/3/998/145">Franco Carbognani</a>, Ezio Melotti, Mario Tambos, Ambar Mehrotra, Shridhar Mishra, Vito Gentile. </p> <p>Thank you Google for this unique birthday gift :)</p> <p>Looking forward to a great and challenging summer of Code!</p> <p>I&rsquo;ll share the details of the project in the next post in this series.</p>GSoC RSS Feed Test Post2015-04-16T18:10:52+05:302015-04-16T18:10:52+05:30Siddhant Shrivastavatag:sidcode.github.io,2015-04-16:/letters/gsoc-rss-feed-test-post/<p>Much GSoC. So RSS. Very Python :)</p> <p>This is a test blog post to check if the <strong>atom.xml</strong> for the category <strong>GSoC</strong> works or not. Python Software Foundation motivates its students to blog at least once in every two weeks (stating the frequency of posting in clear terms because biweekly …</p><p>Much GSoC. 
So RSS. Very Python :)</p> <p>This is a test blog post to check if the <strong>atom.xml</strong> for the category <strong>GSoC</strong> works or not. The Python Software Foundation motivates its students to blog at least once every two weeks (stating the frequency of posting in clear terms because biweekly can be ambiguous sometimes). </p> <p>Blogging the <strong>developments</strong> is essential for any constructive task; in my case the task is Open Source Software development using Python.</p> <h2>Why Blog?</h2> <blockquote> <p>Developments entail success, setbacks (interestingly <em>failure</em> is not used in this industry because nobody ever fails), issues, progress, discussion on design aspects and learning.</p> </blockquote> <h2>Update (29th April, 2015)</h2> <p>The list of accepted students was announced yesterday (amazingly coinciding with my birthday). I have been accepted for the Google Summer of Code program as a student under the Python Software Foundation umbrella with the organization - Italian Mars Society. A detailed post on my acceptance can be <a href="proxy.php?url=https://sidcode.github.io/letters/gsoc-2015-with-the-italian-mars-society/">found here</a>.</p>Notes for this blog2015-04-01T00:53:52+00:002015-04-01T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-04-01:/letters/notes-for-this-blog/<p>List - 0. GSoC as of now (1st May 2015) 1. Disqus comments integration 2. About me page 3. Blog name 4. 1pad - a post every two days 5. On Typeracing 6. On Movies and Literature 7. On writing. 8. On algorithms. 9. On research. 10. On MIT Media Lab 11 …</p><p>List - 0. GSoC as of now (1st May 2015) 1. Disqus comments integration 2. About me page 3. Blog name 4. 1pad - a post every two days 5. On Typeracing 6. On Movies and Literature 7. On writing. 8. On algorithms. 9. On research. 10. On MIT Media Lab 11.
On Linux (Arch), etc.</p>Tonight's Robot : TARS from Interstellar2015-03-16T18:10:52+05:302015-03-16T18:10:52+05:30Siddhant Shrivastavatag:sidcode.github.io,2015-03-16:/letters/tonights-robot-tars-from-interstellar/<p>With this post, I&rsquo;ve decided to flag off a new blog series, <strong>Tonight&rsquo;s Robot</strong>, whose raison d&rsquo;être is the robots of real-life research and fiction, and what they can teach us about Robotics and its allied fields (Human-Robot Interaction, Artificial Intelligence, Roboethics, etc).</p> <p>So this post is about the …</p><p>With this post, I&rsquo;ve decided to flag off a new blog series, <strong>Tonight&rsquo;s Robot</strong>, whose raison d&rsquo;être is the robots of real-life research and fiction, and what they can teach us about Robotics and its allied fields (Human-Robot Interaction, Artificial Intelligence, Roboethics, etc).</p> <p>So this post is about the U.S. Marine Corps robot commanders from the 2014 blockbuster <em>Interstellar</em>, named TARS, CASE, and KIPP.</p> <h1>Robots of Interstellar (2014)</h1> <p><img alt="interstellar" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/tars.jpg"> <em>Picture by Jeorge B. George</em></p> <h1>Here&rsquo;s what these KitKat-bar-shaped Bots teach us:</h1> <h2>1. Companionship</h2> <blockquote> <p>The space missions in <em>Interstellar</em> were decades-long (yeah, relativity). In this scenario, with crew commanders single-manning a ship and spending years with no human around - things can get quite monophobic. The Artificially Intelligent and almost sentient robots shown in Interstellar showcase <em>features</em> like empathy, identification of needs, sarcasm, humour, and honesty; just as one would expect of a human.</p> </blockquote> <h2>2. One with the humans</h2> <blockquote> <p>The robots actually spend all their time with humans or other robots. They are as important as the human flight commanders.
Equal responsibility and equal capability, sans the Relativistic Math skills which Cooper didn&rsquo;t possess. <em>Treatment as equals</em> was something I covered in the <em>Big Hero 6</em> post, <strong>Soft Intelligent Bots</strong>, as well. The robots of the future <strong>would not</strong> be limited to the <strong>4Ds</strong> - <em>Dangerous</em>, <em>Dirty</em>, <em>Daunting</em>, <em>Dull</em> - but would cohabit with humans as equals.</p> </blockquote> <h2>3. Human-Centric, not Robo-Centric</h2> <p><img alt="baymax meets tars" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/baymaxtars.jpg"></p>Soft Intelligent Bots2015-03-11T00:53:52+00:002015-03-11T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-03-11:/letters/soft-intelligent-bots/<blockquote> <p>Disney doesn&rsquo;t always make a robot movie, but when it does it creates a new yardstick for scientific research altogether.</p> </blockquote> <h1>Big Hero 6 - Disney (2014)</h1> <p><img alt="Baymax" src="proxy.php?url=https://sidcode.github.io/letters/soft-intelligent-bots/bighero.png"></p> <p>This post is not the run-of-the-mill weekly thing I do. I happened to stumble upon <em>Big Hero 6</em>, a movie based on an …</p><blockquote> <p>Disney doesn&rsquo;t always make a robot movie, but when it does it creates a new yardstick for scientific research altogether.</p> </blockquote> <h1>Big Hero 6 - Disney (2014)</h1> <p><img alt="Baymax" src="proxy.php?url=https://sidcode.github.io/letters/soft-intelligent-bots/bighero.png"></p> <p>This post is not the run-of-the-mill weekly thing I do. I happened to stumble upon <em>Big Hero 6</em>, a movie based on an adolescent roboticist&rsquo;s life (enough to get me hooked). They say <em>serendipity</em> is a good thing. And they couldn&rsquo;t be more right.
What are the odds of bumping into a life-changing movie after watching Apple&rsquo;s latest keynote presentation of the new MacBook (tactile trackpad) and the new medical research platform - <a href="proxy.php?url=https://www.apple.com/researchkit/">ResearchKit</a>?</p> <h1>The things I love about <strong>Big Hero 6</strong></h1> <h2>1. Inspired by Robotics research, not the other way round</h2> <blockquote> <p>The film&rsquo;s producers blazed a new trail by drawing on the research on <em>Human-Robot interaction</em> and <em>Soft Robotics</em> at the <strong>Robotics Institute, Carnegie Mellon University</strong>.</p> </blockquote> <h2>2. Bot-fighting</h2> <blockquote> <p><strong>Real Steel</strong> is passé now. &lsquo;Big Hero 6&rsquo; starts with underground bot-fights in a conceptualized fusion town - San Fransokyo. And who wins the bot-fight? Not the strongest or goriest bot, but the agile bot with <em>magnetic-bearing servos</em>.</p> </blockquote> <h2>3. Puberty</h2> <blockquote> <p>Hiro Hamada - the protagonist wunderkind - makes sloppy decisions multiple times and goes through mood swings. He has the same tragic flaws that all of us can connect with. He almost loses his mind over making a career in the winning-money-comes-easy bot fights. This is when the role of his elder brother - Tadashi - kicks in. And boy does his short-lived (no pun intended) role inspire all of us to push the boundaries of Robotics.</p> </blockquote> <h2>4. Parenting done right.</h2> <blockquote> <p>Hiro wouldn&rsquo;t be the humble prodigy if not for his brother Tadashi and his aunt Cass. The film dabbles with the myriad aspects of adolescence and parenting - scolding, concern, care, love, help, unconditional support. The elder brother is a parent to him - guiding Hiro onto the right path whenever he goes wayward&hellip;till the very end. Tadashi doesn&rsquo;t die. <em>Tadashi is here.</em> - says Baymax multiple times in the movie. And till the end, Tadashi lives through Baymax and Hiro.
Later in the movie, it becomes quite apparent how Baymax becomes a parent to Hiro - Baymax tells Hiro, <em>&ldquo;Seatbelts save lives. Always fasten them.&rdquo;</em></p> </blockquote> <h2>5. Robotics is hard, and rewarding</h2> <blockquote> <p><em>&ldquo;I would lose my mind if I don&rsquo;t apply here&rdquo;</em> - says Hiro when Tadashi gives him a tour of his university lab. Flying cats, Industrial Manipulators playing Table Tennis, Electro-magnetic suspension on bike wheels, Laser-induced plasma, 3-D printed carbon fibre armours, Body-tracking, Chemical Metal embrittlement - <strong>this film inspired the maker in me</strong>. It beats even Iron Man after a point, which by the way is/was my guide to Robotics. And the <em>magnum opus</em> - <strong>Baymax</strong> - your personal health care assistant. He is the star of the film, through and through.</p> </blockquote> <h2>6. Soft Robotics</h2> <blockquote> <p>The film gets infinite brownie points for treading into the territory of Soft Actuated Robotics. The huggable inflatable Robot that can lift a thousand pounds - that is what the future should be about - not metallic humanoids. This field of study is only beginning to emerge, and in using a concept like Soft Robotics with the use case of a healthcare robot, <strong>the filmmakers have done a fascinating, outstanding job</strong>.</p> </blockquote> <h2>7. The Maker spirit</h2> <blockquote> <p>The MIT Media Lab Design Innovation workshop changed the way I look at hacking. It is about identifying problems, storyboarding ideas, thinking of solutions, and implementing them as efficiently as possible. Hiro Hamada, the quintessential maker, sees a need and fulfills it in his garage. I would definitely have a garage like that soon.</p> </blockquote> <h2>8. Swarm Robotics</h2> <blockquote> <p>The cuteness of the Soft Robot is challenged by grey goo - neurotransmitter-controlled <strong>Microbots</strong> developed by Hiro for the Robotics exhibition.
As Hiro comments - &ldquo;The applications for this tech are <strong>limitless</strong>; the only limit is your imagination.&rdquo; He&rsquo;s right. But a villainous imagination can only flip cars and kill people. And that&rsquo;s what happens. Hiro battles this evil with his friends and Baymax till the very end.</p> </blockquote> <h2>9. Robots treated as equals</h2> <blockquote> <p>The title of the movie is Big Hero <strong>6</strong>. This includes the 5 humans and Baymax; the film is a great step forward in Roboethics. <strong>Our programming prevents us from injuring a human being</strong>.</p> </blockquote> <h2>10. Expectations from a Health Care robot</h2> <blockquote> <p><strong>&ldquo;I fail to see how this makes me a better healthcare companion.&rdquo;</strong> Baymax voices this concern multiple times about his intimidating upgrades. The movie follows the lines of <strong>Robot and Frank</strong>, where a personal robot attendant understands the needs of its user - right or wrong. Baymax is reluctant to follow harm-causing orders. From this, we can learn that AI can be controlled, and need not result in singularity.
Even if it does, the robots would be intelligent enough to keep the humans who are good at heart.</p> </blockquote> <p>So here&rsquo;s to the robots of today and tomorrow - who can feel just like a human - <strong>social robotics</strong> and human-robot interaction are potential early adopters of the developments that sprout from this research.</p> <p>Here are some Social Robots that might interest you:</p> <h1>Jibo - MIT Media Lab</h1> <p><img alt="Jibo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/jibo.jpg"></p> <h1>Romeo - Aldebaran Robotics</h1> <p><img alt="Romeo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/romeo.jpg"></p>My first experience with Pelican2015-02-17T11:52:00+08:002015-02-18T10:57:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-17:/letters/my-first-experience-with-pelican/<p>I am trying to set up Pelican here.</p> <p>Edit: The setup process was so much easier than Octopress that I could complete manual migration in less than half an hour.</p> <p>Thanks to the plugins and theme support, everything that a blogger might need is simply <em>batteries included</em>.</p> <p>So here is …</p><p>I am trying to set up Pelican here.</p> <p>Edit: The setup process was so much easier than Octopress that I could complete manual migration in less than half an hour.</p> <p>Thanks to the plugins and theme support, everything that a blogger might need is simply <em>batteries included</em>.</p> <p>So here is a list of resources that I found indispensable:</p> <ol> <li> <h2>Octopress theme</h2> </li> </ol> <blockquote> <p>The readability and functionality offered by this minimalist theme are brilliant.</p> </blockquote> <ol start="2"> <li> <h2>Social Media Integration</h2> </li> </ol> <blockquote> <p>The configuration files <em>pelicanconf.py</em> and <em>publishconf.py</em> have ample variables for linking with the popular social media platforms - Facebook, Twitter, G+.</p> </blockquote> <ol start="3"> <li>
<h2>Google Analytics</h2> </li> </ol> <blockquote> <p>Open Source bloggers can use this to track hits and get visual feedback of visits over time.</p> </blockquote> <ol start="4"> <li> <h2>Markdown</h2> </li> </ol> <blockquote> <p>I like this typesetting format for multiple reasons - touch-typing-friendly characters, frugal use of tags, expressiveness, and the LaTeX-like non-WYSIWYG nature.</p> </blockquote>BeautifulSoup for Coursera2015-02-17T00:53:52+00:002015-02-17T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-17:/letters/beautifulsoup-for-coursera/<p>In a significant bandwidth-saving move, the MOOC organization <strong>Coursera</strong> <em>removed</em> the downloadable links from its course videos page. Here are the implications:</p> <h2>Earlier</h2> <blockquote> <p>Use the DownThemAll Firefox plugin/coursera-dl/wget to get all links to all course videos and start downloading.</p> </blockquote> <h1>Current Scenario</h1> <blockquote> <p>The student has to take a scattershot …</p></blockquote><p>In a significant bandwidth-saving move, the MOOC organization <strong>Coursera</strong> <em>removed</em> the downloadable links from its course videos page. Here are the implications:</p> <h2>Earlier</h2> <blockquote> <p>Use the DownThemAll Firefox plugin/coursera-dl/wget to get all links to all course videos and start downloading.</p> </blockquote> <h1>Current Scenario</h1> <blockquote> <p>The student has to take a scattershot
There is no single page with all downloadable links.</p> </blockquote> <h2>How does this affect the MOOC user?</h2> <blockquote> <ol> <li>Courses will be chosen wisely</li> <li>Watch only what you need.</li> <li>Salvaged Bandwidth - content loads faster</li> </ol> </blockquote> <h2>Workaround</h2> <p>I am trying to learn and use the Python-based parsing library <strong>BeautifulSoup</strong> to parse the html pages and extract the video links at one place so that the tools like <em>DownThemAll</em> become instrumental to this end again.</p> <blockquote> <p>It&rsquo;s only a matter of time that this impending inconvenience be resolved - await the nifty hacks!</p> </blockquote> <p>Till then, Ta!</p>Human Sensor Network and Fusion2015-02-15T00:53:52+00:002015-02-15T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-15:/letters/human-sensor-network-and-fusion/<p>In the true spirit of <em>Sweat and Neurons</em>, this rambling mindgasm is about an exquisite biosensor and intelligent network - a living organism&rsquo;s sensory and nervous systems.</p> <ol> <li><strong>Proactive vs Reactive systems</strong></li> </ol> <blockquote> <p><em>Sensory</em> systems are reactive whereas the <em>nervous system</em> exhibits reinforcement - it learns from its past and grows stronger on …</p></blockquote><p>In the true spirit of <em>Sweat and Neurons</em>, this rambling mindgasm is about an exquisite biosensor and intelligent network - a living organism&rsquo;s sensory and nervous systems.</p> <ol> <li><strong>Proactive vs Reactive systems</strong></li> </ol> <blockquote> <p><em>Sensory</em> systems are reactive whereas the <em>nervous system</em> exhibits reinforcement - it learns from its past and grows stronger on positive feedback - that is, it is proactive to future decisions. 
</p> </blockquote> <ol start="2"> <li><strong>How did the systems evolve?</strong></li> </ol> <blockquote> <p>Evolution (artificial and natural selection, genetic aberrations), compounded by the survival of the fittest, most agile (reactive), and most adaptive (proactive) species, brought these systems to where they stand today.</p> </blockquote> <ol start="3"> <li><strong>Where are these systems headed?</strong></li> </ol> <blockquote> <p>Research on sensor networks (wired/wireless), Cellular Automata and Artificial Intelligence suggests our disposition towards making an electronic human. Evolution might have had to struggle with the same hiccups - sensor resolution, power requirements, accuracy, layout, response time, data fusion - that researchers struggle with today.</p> </blockquote> <p>So here&rsquo;s to the robots of tomorrow - who can feel just like a human - <strong>social robotics</strong> and human-robot interaction are potential early adopters of the developments that sprout from this research.</p> <p>Here are some Social Robots that might interest you:</p> <h1>Jibo - MIT Media Lab</h1> <p><img alt="Jibo" src="proxy.php?url=https://sidcode.github.io/images/articles/2015/jibo.jpg"></p>Identifying Bots from above2015-02-13T00:53:52+00:002015-02-13T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-13:/letters/identifying-bots-from-above/<p>After a firm resolve to abstain from using C++ for any programming project, I am back to square one - this time with a Robotics research project. The team at The Robotics lab is creating a testbed for swarm algorithms on e-puck bots. This entails localization and communication. I have chosen …</p><p>After a firm resolve to abstain from using C++ for any programming project, I am back to square one - this time with a Robotics research project. The team at The Robotics lab is creating a testbed for swarm algorithms on e-puck bots. This entails localization and communication.
I have chosen to ID each of the bots on the move via a QR code. This will be done using OpenCV and C++. Learnt about -</p> <ul> <li>CMake</li> <li>g++ compatibility and flags</li> <li>OpenCV warping and contours</li> <li>Cascade Classifiers</li> </ul> <p>The <code>zbar</code> library for Linux makes decoding QR codes a breeze. All this while, hunting for the perfect library for this job was a wild-goose chase. It even turns out that ROS has its own wrapper library - <strong>roszbar</strong> - that exposes the features of zbar as a ROS node, making it much easier for the lab to work with our TurtleBot.</p> <p>Here&rsquo;s to visually identifying robots in a lab.</p>Memcomputing is here to stay2015-02-05T00:53:52+00:002015-02-05T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-05:/letters/memcomputing-is-here-to-stay/<p>Competitive Programming is about to perish. Information Processing and Storage at the same space and time. </p> <p>Now that I have your undivided attention, it would be worth the spacetime to take a look at the research being conducted on <strong>Memcomputing</strong>. </p> <blockquote> <p>Memcomputing is a theorized notion introduced in the &rsquo;70s which is currently being realized with nanoscale devices. </p> </blockquote> <p>It utilizes the features of special circuit elements on the nanoscale - memristors, memcapacitors, and meminductors.</p> <p>The recent arXiv paper on <em>Universal Memcomputing Machines</em> convinces me that Moore&rsquo;s law is here to stay.
It sidesteps the von Neumann bottleneck - the constraint that code and data must be stored in different places while still depending on each other. Memory devices like those mentioned above maintain a history of current, charge, and flux respectively. This enables them to act like normal electric components as well as take on the properties of a transistor.</p> <blockquote> <p>Alan Turing would be so proud!</p> </blockquote>Life is a 2-D board game!2015-02-04T00:53:52+00:002015-02-04T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-02-04:/letters/life-is-a-2-d-board-game/<p><strong>Conway&rsquo;s Game of Life</strong> - the zero-player game comprised of cellular automata with four simple rules. It was devised by John Conway, popularized through a <em>Scientific American</em> column, and not very long ago, it reassured me of the beauty of Mathematics. </p> <blockquote> <p>The game has intrigued me for quite …</p></blockquote><p><strong>Conway&rsquo;s Game of Life</strong> - the zero-player game comprised of cellular automata with four simple rules. It was devised by John Conway, popularized through a <em>Scientific American</em> column, and not very long ago, it reassured me of the beauty of Mathematics. </p> <blockquote> <p>The game has intrigued me for quite a long time.</p> </blockquote> <p>I got the opportunity to dabble with the intricate patterns and to see what happens when I alter the rules. The results are, to say the least, quite disastrous. The conditions for Conway&rsquo;s game begin to crumble with simple changes in the rules. Simulations under way.</p> <p>I am thinking along the lines of applying the Game of Life to <strong>Pattern Formation problems in Robotics</strong>. For its automaton-ness&hellip;no history, no future. Adaptability to future outcome, modification of rules as and when you like, the tenacity of the patterns. Oscillators and still lifes can all be used in different domains of Robotics.
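For concreteness, the four rules (a dead cell with exactly three live neighbours is born; a live cell with two or three live neighbours survives; everything else dies) fit in a few lines of Python - a toy sketch of my own for rule-tweaking experiments, not any robot controller:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.

    `live` is a set of (x, y) cells that are alive; returns the next set.
    """
    # Count, for every cell adjacent to a live cell, how many live neighbours it has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a row and a column with period 2:
blinker = {(0, 1), (1, 1), (2, 1)}
assert life_step(life_step(blinker)) == blinker
```

Altering the birth/survival thresholds in that last set comprehension is exactly the rule-tweaking experiment described above - most variants collapse or explode within a few generations.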
</p> <blockquote> <p>An entire world of Robots can be imagined where all the different kinds of bots follow the rules of the game of life.</p> </blockquote> <p>This not only controls the population, but also results in self-sufficient communities. This can teach us things about Android epistemology. For instance, a robot meant for video surveillance would not move, hence lives as a Still Life. A robot meant to patrol a region frequently should be an Oscillator. The simple states of the Automata ensure crispness of pattern formation.</p> <p>However, it would be a challenge to deploy robots to the next states in the Automata - deciding which robots to move where, which robots to keep, and which robots to summon. Virtually making a cell dead even when a robot lives there.</p> <p>Cellular Automata would prove beneficial in self-organizing systems.</p> <p>People have tried the game with QR codes.</p> <p>Fun Fact: I was introduced to Conway&rsquo;s Game via a strategy board game with the same name - &ldquo;Game of Life&rdquo;. </p>The LaTeX course of life2015-01-10T16:42:14+00:002015-01-10T16:42:14+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-01-10:/letters/the-latex-course-of-life/<p>Summer internships&rsquo; deadlines knell by as I key this post. A common section in all application forms is the Curriculum Vitae/Résumé. Having made 2 versions of my résumé in Microsoft Word, the FOSS enthusiast in me felt the pinch. </p> <h2>I decided - my résumé should be in LaTeX, the typesetting …</h2><p>Summer internships&rsquo; deadlines knell by as I key this post. A common section in all application forms is the Curriculum Vitae/Résumé. Having made 2 versions of my résumé in Microsoft Word, the FOSS enthusiast in me felt the pinch. </p> <h2>I decided - my résumé should be in LaTeX, the typesetting framework based on Knuth&rsquo;s TeX.</h2> <p>Having only a theoretical knowledge of how LaTeX worked, I probed into the open-source community.
Forked Debarghya Das&rsquo; résumé, which I liked for its brevity - everything summarized in just one page.</p> <p>This exercise taught me the importance of documentation, yet again. I was not merely modifying someone&rsquo;s LaTeX code. The documentation was edifying and I actually learnt a lot in the process.</p> <p>OK, enough said. I urge you to try out LaTeX if you haven&rsquo;t already. It has a steep learning curve for certain - but I am sure the hacker in you will be happy to help you climb the TeXy mountain.</p> <p>Alright, back to completing my application forms :) </p>I like my factors Prime and my numbers Random2015-01-10T16:07:04+00:002015-01-10T16:07:04+00:00Siddhant Shrivastavatag:sidcode.github.io,2015-01-10:/letters/i-like-my-factors-prime-and-and-my-numbers-random/<p>A Quadratic Sieve is currently factorizing 7393913335919140050521110339491123405991919445111971 as I write this post. And the choice of this 52-digit number is not random.</p> <blockquote> <p>The largest prime factor of this number will lead me to the next level in hacker.org challenges.</p> </blockquote> <p>I am currently working on creating the Python bindings to
A <a href="proxy.php?url=www.pcg-random.org">recent approach</a> claims to be perform significantly well over other PRNGs across different statistical metrics.</p> <p>hacker.org teaches me a lot of nifty tricks in my favorite language of late - Python.</p> <p>To know more about all the bit-level twiddling stuff, refer to <strong>Hacker&rsquo;s Delight</strong> by Henry Warren Jr.</p>A Computing Era gone by...2014-06-28T14:55:17+05:302014-06-28T14:55:17+05:30Siddhant Shrivastavatag:sidcode.github.io,2014-06-28:/letters/a-computing-era-gone-by/<p>Being a 90s person, mine was one of the first generation adopters of intuitive Graphical User Interfaces. It took me 18 years to properly lose my Unix virginity. It not only brought a plethora of hacking skills, but also a deep appreciation of our rich computer heritage. Computer Science is …</p><p>Being a 90s person, mine was one of the first generation adopters of intuitive Graphical User Interfaces. It took me 18 years to properly lose my Unix virginity. It not only brought a plethora of hacking skills, but also a deep appreciation of our rich computer heritage. Computer Science is the best thing to happen to the world in the last century. Turing had a hunch - to replace the scores of women &lsquo;computers&rsquo; calculating away during the World Wars, with a hypothetical machine that follows instructions and operates on data - continuously maintaining one of its several states.</p> <p>We have come a long way since. The Computer History Museum does an awesome job at preserving the heroics of the people who contrived contraptions that made life easier.</p> <p>Microsoft dominated the Unix market with its licensed platform - Xenix. DOS is heavily inspired by Unix. But then they split ways.</p> <p>Much more to be added in this post. Keep looking for the buzz.</p> <p>https://en.wikipedia.org/wiki/Computer_terminals https://en.wikipedia.org/wiki/Teletype_Model_33</p> <p>Refer : Unix Prog. 
Environment by Kernighan and Pike; The Man Who Knew Too Much - a life of Alan Turing</p>The Tree of Life2014-06-16T22:43:56+00:002014-06-16T22:43:56+00:00Siddhant Shrivastavatag:sidcode.github.io,2014-06-16:/letters/the-tree-of-life/<p>Watching Cosmos [S01E02]. Neil deGrasse Tyson speaks about artificial and natural selection (dog breeds and polar bears, respectively). He also points out that we share our DNA with plants, and furthermore, plants and humans share DNA with birds, etc. So we can backtrack in a similar inductive fashion to reason that …</p><p>Watching Cosmos [S01E02]. Neil deGrasse Tyson speaks about artificial and natural selection (dog breeds and polar bears, respectively). He also points out that we share our DNA with plants, and furthermore, plants and humans share DNA with birds, etc. So we can backtrack in a similar inductive fashion to reason that life started from ONE COMMON ancestor.</p> <p>What do I think? We have ML and DM backtracking mechanisms available. Why not think in this direction - that we can establish a correlation between different species, a coming together of various kingdoms, in a way that both explains and justifies evolution, both in the artificial and the natural way.</p> <p>My hunches are sometimes right, and mostly overstated. But we can try, right? It&rsquo;ll teach us a lot of things. Firstly it&rsquo;ll make us better at looking at ourselves, at the world as one great family (that&rsquo;s what Vasudhaiva Kutumbakam is all about, right?)</p> <p>If we can use backtracking to predict disease vectors and weather conditions (from the IIRS orientation session), I believe we can collectively develop a genealogy project for species all over.</p> <p>After all, why am I learning all the tools to process data, and to code?
To make the world a better place, right?</p> <p>Remember, we don&rsquo;t need to save the planet; the planet is strong enough to sustain itself. We need to save ourselves from the disasters we can cause artificially.</p>From GitHub pages to sidcode.github.io2014-06-14T00:53:52+00:002014-06-14T00:53:52+00:00Siddhant Shrivastavatag:sidcode.github.io,2014-06-14:/letters/from-github-pages-to-sidcodegithubio/<p>Being a total amateur when it comes to Domain Names, TLDs, <code>CNAME</code> records, and selecting a service from the brutal competition among Domain/Host providers; this journey has been exhausting and time-consuming, but essentially important.</p> <p>Basically everything boils down to communication between two computers in the world. Computers on the …</p><p>Being a total amateur when it comes to Domain Names, TLDs, <code>CNAME</code> records, and selecting a service from the brutal competition among Domain/Host providers; this journey has been exhausting and time-consuming, but essentially important.</p> <p>Basically everything boils down to communication between two computers in the world. Computers on the World Wide Web (the Internet as we know it - ignoring the deep web for now) are addressed by a set of numbers known as IP (Internet Protocol) addresses. Just like your home address, this helps all the computers in the world know where &ldquo;on the network&rdquo; your computer or your domain provider&rsquo;s computer is. IP addresses are generally of the form &ldquo;192.30.252.153&rdquo; in the IPv4 scheme (read more about Combinatorial Explosion and IP in this post&rsquo;s footnotes). Since memorizing numbers is considered too brain-intensive for humans, we delegate this important task to computers, and create human-friendly names in Natural Languages (like English, Arabic, Hindi, Mandarin, etc.). For example, google.com runs on a lot of computers and one such address is 173.194.36.5.
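The same name-to-number lookup can also be done programmatically - here is a small sketch using Python's standard library resolver (the addresses you get back will differ depending on where and when you run it):

```python
import socket

def resolve_ipv4(host):
    """Ask the system resolver for the IPv4 addresses behind a hostname."""
    _canonical_name, _aliases, addresses = socket.gethostbyname_ex(host)
    return addresses

# For example (actual results vary by location and time):
# resolve_ipv4("google.com")  might return  ['173.194.36.5', ...]
```
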
Copy-paste this number into your browser&rsquo;s address bar and you&rsquo;ll reach google.com :D . How do I know about this number? There are a number of ways to find it out. I used:</p> <div class="highlight"><pre><span></span><code>dig google.com </code></pre></div> <p>whose answer section resulted in</p> <div class="highlight"><pre><span></span><code>google.com. 91 IN A 173.194.36.7 google.com. 91 IN A 173.194.36.4 google.com. 91 IN A 173.194.36.5 google.com. 91 IN A 173.194.36.2 google.com. 91 IN A 173.194.36.3 google.com. 91 IN A 173.194.36.14 google.com. 91 IN A 173.194.36.0 google.com. 91 IN A 173.194.36.1 google.com. 91 IN A 173.194.36.6 google.com. 91 IN A 173.194.36.9 google.com. 91 IN A 173.194.36.8 </code></pre></div> <p><code>ping google.com</code> would also reveal one of these addresses; ping is a useful utility for testing connections over any type of network topology - I&rsquo;ll be using it more frequently when I talk about my Raspberry Pi setup over a VNC server. (There, too, I&rsquo;ll have to set up a static IP.)</p> <p>So, getting back to the mapping of IP addresses to English-language words - henceforth referred to as &ldquo;domain names&rdquo;. People like you and me have the choice to select a name of our liking (similar in spirit to christening children, naming a company, or naming anything in general). Of course, with choice comes competition - and more often than not, the domain name we want is already chosen by somebody else (remember, it has to be unique, unlike human names). In my case, siddhant.org was already taken. Since this blog focuses on the confluence of Computer Science and everything else, I decided to append a &ldquo;sci&rdquo; to my name, and hence this domain is www.sidcode.github.io.</p> <p>Initially this site was hosted on GitHub Pages, i.e. at sidcode.github.io. I always wanted a custom domain name (David Malan sir&rsquo;s CS50 lectures also motivated me to get one :D).</p> <p>So I went domain service provider hunting.
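Each line of the dig-style answer above follows the same five-field shape: name, TTL, class, record type, and address. As an aside, a minimal Python sketch can pull the addresses out of such output (the <code>parse_a_records</code> helper is hypothetical, written here for illustration only):

```python
# Parse dig-style DNS answer lines such as
#   "google.com. 91 IN A 173.194.36.7"
# into (name, ttl, address) tuples.
# parse_a_records is a hypothetical helper for this post,
# not part of any DNS library.

def parse_a_records(answer_text):
    records = []
    for line in answer_text.strip().splitlines():
        fields = line.split()
        # Expected fields: name, TTL, class ("IN"), type ("A"), IPv4 address
        if len(fields) == 5 and fields[2:4] == ["IN", "A"]:
            name, ttl, _klass, _rtype, address = fields
            records.append((name.rstrip("."), int(ttl), address))
    return records

answer = """\
google.com. 91 IN A 173.194.36.7
google.com. 91 IN A 173.194.36.4"""

for name, ttl, address in parse_a_records(answer):
    print(name, ttl, address)
```

In practice a resolver library (for instance <code>socket.gethostbyname_ex</code> in Python&rsquo;s standard library) performs the lookup for you; the sketch only shows how regular the record format is.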
Having just a brief idea about some providers like GoDaddy, I was shocked at the intense competition in this domain (pun unintended). With every other company claiming features like &ldquo;Unlimited Bandwidth&rdquo;, &ldquo;Unlimited Domains&rdquo;, &ldquo;Unlimited email addresses&rdquo;, and what not, I got waylaid and lost. Of course, the limitlessness these providers offer is an illusion for the people hosting their sites on their servers. Servers are simply normal computers running dedicated software that receives incoming connections and relays outputs according to the requests. And they have to run ALL THE TIME; otherwise the site goes down, since there would be no computer at the receiving end to reply. But I realized that I don&rsquo;t need a dedicated server from these hosting providers, since this blog would be hosted on GitHub Pages. All I needed was a domain name to point at this blog.</p> <p>Thus I just bought the domain name for a year (the damage was around $13) on HostGator.</p> <p>I modified the DNS records to point to GitHub&rsquo;s IP addresses, as documented at https://help.github.com/articles/setting-up-a-custom-domain-with-github-pages . So I created an <code>A record</code> pointing from the host (sidcode.github.io) to the static IPs provided by GitHub. On GitHub&rsquo;s side, I created a <code>CNAME</code> file with just the content &ldquo;www.sidcode.github.io&rdquo; in it. This essentially aliased www.sidcode.github.io to sidcode.github.io.</p> <p>Now I have to wait about another day for the Domain Name System to acknowledge this mapping before you can access this site. If you are reading this, it means that everything is properly configured.</p>Lessons from the Octopress Setup2014-06-11T23:37:37+00:002014-06-11T23:37:37+00:00Siddhant Shrivastavatag:sidcode.github.io,2014-06-11:/letters/lessons-from-the-octopress-setup/<p>This is my first post on this blog. I am learning to play with the various commands and get the knack of the basic publishing workflow.</p> <div class="highlight"><pre><span></span><code>This is how syntax highlighting is done </code></pre></div> <p>After creating <code>.markdown</code> files, all one has to do is push the changes to the master branch of the remote repo:</p> <div class="highlight"><pre><span></span><code><span class="nv">bundle</span><span class="w"> </span><span class="k">exec</span><span class="w"> </span><span class="nv">rake</span><span class="w"> </span><span class="nv">generate</span><span class="w"></span> <span class="nv">bundle</span><span class="w"> </span><span class="k">exec</span><span class="w"> </span><span class="nv">rake</span><span class="w"> </span><span class="nv">deploy</span><span class="w"></span> </code></pre></div> <p>Git commands to be used:</p> <p>To update the <code>source</code> branch with the latest changes (it is recommended to keep a backup):</p> <div class="highlight"><pre><span></span><code>git add .
git commit -m &quot;source update #&quot; git push origin source </code></pre></div> <p>whereas to update the master branch, go to the _deploy directory and run:</p> <div class="highlight"><pre><span></span><code><span class="nv">git</span><span class="w"> </span><span class="nv">pull</span><span class="w"> </span><span class="nv">origin</span><span class="w"> </span><span class="nv">master</span><span class="w"></span> <span class="nv">bundle</span><span class="w"> </span><span class="k">exec</span><span class="w"> </span><span class="nv">rake</span><span class="w"> </span><span class="nv">deploy</span><span class="w"></span> </code></pre></div> <p>To-do:</p> <ol> <li>Add pages: about, travel, competitive programming, open source contributions, BITS work, SciTech News, etc.</li> <li>Understand the different plugins available for the blog, and use some of them - like Google Analytics, g+ integration, etc.</li> </ol> <p>Finally, after posting a couple of posts on Blogger and WordPress, I feel at home@Octopress.</p> <p>Cheers</p>Effectively enforcing eloquent English expression2004-09-07T17:21:00+08:002004-09-07T17:21:00+08:00Siddhant Shrivastavatag:sidcode.github.io,2004-09-07:/letters/effectively-enforcing-eloquent-english-expression/<section> <p>Today I managed to snag three <em>stars</em> from two classmates. Let me explain. <em>The protagonist clears his prepubescent throat. Out comes a distinct alto voice trained in some Hindustani Classical music</em>.</p> <p>This is a <em>game</em> that my classy class teacher conjured up.
She wants us to actively improve our conversational English as we stand at the cusp of magnificently mortifying middle school. The rules of the game are-</p> <blockquote> <p>Every student starts with 5 paper stars they craft and personalize. They have to give up one star to any student who catches them speaking a language other than English (usually Hindi, but it could be anything else, really). At the end of the week, whoever loses all their stars is made to deliver a two-minute impromptu speech in English. Whoever collects the most stars gets to relax for a day on the whole English-speaking thing AND redistribute their extra stars to those who have fewer than five. A rising tide lifts all the boats. And guess what, it did just that!</p> </blockquote> <p>Conclusion of the experiment- we could speak and think in English at will without subconsciously switching to other languages. This feat took just a month for our preteen prefrontal cortices to pick up.</p> <p>I also realized the following-</p> <blockquote> <p>You could gamify even the most mundane minutiae and will a bunch of pavlovian primitive preteens into anything you desire as a result. </p> </blockquote> <p>I&rsquo;ll return to my religious Pokemon (5 PM) and Digimon (5:30 PM) ritual on Cartoon Network.</p> <p><em>Fin</em></p> <p>Note from the 2010s- This was a very primitive form of seeders and leechers in torrenting. We could have gamed the game by ensuring no student ever ran out of stars. But we didn&rsquo;t. Children can be cruelly competitive at the lamest of games.</p> <p>Note from the roaring 2020s- Moments like this game are what made me me, as English flows through me like body english. I always wondered whether there was a name for what batsmen do after missing a ball. Fortunately, I came across <em>body english</em> while watching some videos on proper lifting form. Body english, or momentum lifting, is a bad idea in the gym.</p> <p>Note from the lulled 2030s- There are no real people snagging stars, only bots.
School life is either too cooperative or too competitive (depending on which virtual school the student attends). Everyone thinks only in English, as that is the only language the brain-computer interface understands at the moment.</p> <p>Note from the 2040s- Such amazing foresight and intervention by a teacher who challenged us then-kids just the right amount. </p> </section>Ontologies of Orwell1984-01-31T20:50:00+08:001984-01-31T20:50:00+08:00Siddhant Shrivastavatag:sidcode.github.io,1984-01-31:/letters/ontologies-of-orwell/<section> <h2>Subtitle</h2> <p>People post keeping the future in mind. I&rsquo;ll live through the past and rewrite my version of history, because I can. Backposting ftw! </p> </section>The Use of Knowledge in Society1984-01-31T20:50:00+08:001984-01-31T20:50:00+08:00Siddhant Shrivastavatag:sidcode.github.io,1984-01-31:/letters/the-use-of-knowledge-in-society/<section> <h2>Subtitle</h2> <p>Backposting ftw! Jimmy Wales will be very inspired by this <a href="proxy.php?url=https://www.cato.org/sites/cato.org/files/articles/hayek-use-knowledge-society.pdf">essay</a> and start Wikipedia.</p> <p>The wisdom of crowds (decentralized knowledge), or <strong>street smartness</strong>, is as important as scientific knowledge.</p> </section>