[{"title":"About","url":"about.html","body":"This site is a mixture of my notes, a portfolio and a record of my hobbies and interests. It might be morphing into a Zettelkasten. It\u2019s a personal project and somewhere I can experiment with web technologies. Most of my work is as a freelance data scientist. I also do web development and blockchain work. If you have questions, comments or find a mistake, you can find me on twitter or email. Thanks for\u00a0reading."},{"title":"Recognizing Traffic Lights Using The Azure Custom Vision API","url":"traffic.html","body":"You can upload any image, I suggest googling dashcam traffic lights, or use one of the images below. Images should be smaller than 4mb and should be .jpeg, .bmp or .png. The Shortest Route To A\u00a0Demo This demonstration was inspired by the job description for a freelance role I recently applied for. The project involved recognising faults with traffic lights and I wanted to see how quickly I could develop an end-to-end computer vision system that recognises and labels them. This is a relatively simple solution which prioritizes speed, simplicity and low costs. I used the free tier of the Azure Custom Vision service to train and deploy the model. The model is trained to recognise and label traffic lights. Summary The model is hosted on Azure Custom Vision on the free\u00a0tier. The model is trained on ~4500\u00a0images. Images are part of the DriveU Traffic Light\u00a0Dataset. To improve the model I\u00a0would: Use many more\u00a0images. Experiment with different training options. Tune the model. Method Find a good dataset. It would have taken too long to create my own labelled dataset so I needed to find a freely available set of labelled images. It turns out there are several to choose from. Waymo have a huge dataset that is freely available but I chose to use the DriveU Traffic Light dataset instead. It\u2019s well documented, easily accessible and good\u00a0enough. Convert the images - the DTLD images are 16-bit .TIFF images. I needed .JPEG or .PNG images. 
I first converted the 16-bit .TIFF images to 8-bit, and then converted them to .JPEG. The DTLD dataset contains more image data and metadata than I needed so I simply ignored or stripped out what I didn\u2019t\u00a0need. Parse the label data to extract only the information I needed. The DTLD dataset contains labels that specify the location of the traffic lights in the images as well as the type of traffic light and what phase the lights are in. I was only interested in the location of the lights for this demo. I needed to convert the coordinates of the regions from absolute to\u00a0relative. Create a Custom Vision project, create custom tags, and upload pairs of images and labels in the required format. The documentation was good enough and whilst there were a few steps that were\u00a0unclear, I was able to quickly figure out what to do, usually by clicking around to try a couple things and check the results. Each cloud platform has its own quirks and design concepts, and once you\u2019ve understood the pattern you can develop a good intuition for how each platform (in this case Azure) \u201cwants\u201d you to do\u00a0something. Train the model. There aren\u2019t many options to choose from and the dataset wasn\u2019t very\u00a0big. Use the model to make predictions. Create a simple UI on a static site (this page) using JavaScript and HTML. The JavaScript Fetch API is used to query the Custom Vision API. jQuery and some custom (vanilla) JavaScript is used to parse the results and create the interactive elements on the\u00a0page. The model\u2019s results are shown by overlaying an HTML canvas element on top of the img element that shows the image that\u2019s been uploaded by the user. The regions and probabilities are drawn using HTML Canvas methods (strokeRect, fillText, etc). Next\u00a0steps The model is trained on images from German cities. In order to generalise the model it should be trained using images from a wider distribution. This could\u00a0include: Rural areas. Views from footpaths as well as\u00a0roads. 
Different \u2026. Different cities and\u00a0countries. It would be nice to let the user adjust the minimum probability threshold. Currently only results with a probability above 10% are\u00a0shown. Test\u00a0images You could use these images to test the model. You\u2019ll need to store them locally. Test Image\u00a01 Test Image\u00a02 Test Image\u00a03 Test Image\u00a04"},{"title":"books","url":"book-notes.html","body":"A list of articles that are notes on\u00a0books."},{"title":"","url":"experience.html","body":"2021: Freelance Senior Data Scientist - Wayfair 2021: Freelance Blockchain Developer - Bitladon Blockchain integration for a cryptocurrency broker. Deploying and managing nodes for networks including Ethereum, Binance Smart Chain, Polkadot, Cardano, Tron, and others. Ansible and Docker were used for configuration. The Rosetta API was used to create workflows including generating batches of addresses, notifying services about deposits, and more. 2020: Technical Founder and Web\u00a0Developer PipPip.email Event-driven and long-term scheduled email delivery. Focus on writing then relax knowing that scheduled delivery is\u00a0guaranteed. MoneyBar.n\u2026 Personal financial dashboard and data tools. 2019: Freelance Data Scientist - \u201cJohn is a creative and conscientious software engineer, who understands business requirements well and translating those into applications under tight deadlines. Highly recommended.\u201d - NLP, data driven web-apps, mentoring. Working as an internal consultant I developed and delivered a range of tools. I worked with a wide variety of stakeholders and prioritized understanding business needs and defining scopes, whilst using agile development practices. I advised my team on software development practices and tooling decisions, and mentored junior members to improve their coding and business skills. Tools included Python, Plot.ly Dash, WSL, and Azure. 2019: Freelance Data Analyst - Uber \u201cJohn played a big role helping my team get our biggest and most challenging analytics tool across the line. 
He ramped up quickly, communicated well and was always responsive.\u201d - Sankari Nair, Lead. Developed an analytics tool using Python and Plot.ly Dash. The tool has a broad scope covering multiple regions, scales and business lines. I re-designed the app for scalability and performance whilst increasing functionality, and provided a flexible foundation for new features to be developed after I left the project. Challenges included building custom components, refactoring legacy code, and processing large datasets. 2018: Data Specialist - Blockport Design, create and maintain internal data tools for a cryptocurrency exchange. As the only data specialist I did whatever needed to be done. I delivered tools to provide business insights including management information, fraud analysis, tax reporting, KPI tracking, regulatory compliance and marketing and growth. I also delivered sentiment analysis of key social channels. I worked with stakeholders across the business including back-end, founders, customer support, finance, DevOps and growth\u00a0teams. Tools included: Python, Bokeh, Google Cloud Platform (BigQuery, DataStore, Data Studio), PostgreSQL. Blockport has been bought by Bux. 2017: Freelance Technical design and execution of a novel cryptographic crowd-funding method. Whilst working with a tech startup as a financial accountant I worked with stakeholders and external developers to design, test and execute an initial coin offering (ICO). I also led investor relations throughout the funding round and provided \u2026 2014: PwC Assurance - Banking & Capital Markets Data engineering and analytics Delivered the ETL pipeline, analysis and visualization of large financial datasets including financial journals and loan books. 
As a chartered accountant and external financial auditor of clients in banking and capital markets in London (HSBC, Barclays, Lloyds, BNP Paribas) I facilitated the transfer of data from client systems before transforming and loading it into our on-prem SQL environment. We recalculated clients\u2019 financial statements and mined the data for additional insights. I also qualified as a chartered accountant with the ICAEW. 2010: PhD: Geotechnical Engineering In 2010 I began my PhD researching granular materials at the University of Natural Resources and Life Sciences in Vienna. Information about my research of silos and granular flows can be found here. 2009: Masters Degree: Civil and \u2026 During my final year at Edinburgh University the Great Recession arrived. After I graduated I found a job at Starbucks and became curious about finance. I resolved to one day understand \u201chow banks work\u201d. \u2014 @johnmathe"},{"title":"fuse-search-snippets","url":"fuse-search-snippets.html","body":""},{"title":"fuse","url":"fuse.html","body":""},{"title":"landing-images","url":"landing-images.html","body":""},{"title":"title","url":"index.html","body":"This is the page\u00a0content. Content from this page isn\u2019t seen. Images on the landing page need to be set in: 1. landing.html"},{"title":"Averages","url":"health.html","body":""},{"title":"pages","url":"pages.html","body":"This page is a list of links to other pages. Each of the linked pages is a collection of posts around a particular theme - book reviews, photography. Content is all in the template (pages.html)."},{"title":"Portfolio","url":"portfolio.html","body":"Content is all in the template. 
It\u2019s a list of links to other\u00a0pages."},{"title":"Snippets","url":"snippets.html","body":""},{"title":"Analytics","url":"analytics.html","body":""},{"title":"Naming things is\u00a0hard","category":"snippet","url":"naming-things-is-hard.html","date":"14 October 2021","tags":"non-technical/entrepreneurship ","body":"Seth\u2019s blog - Background and\u00a0context: small business website. Process. Testing."},{"title":"Tove Lo -\u00a0Habits","category":"snippet","url":"tove-lo-habits.html","date":"14 October 2021","tags":"pop ","body":"wikipedia"},{"title":"Logging Best Practices\u00a0(notes)","category":"Technical/Engineering","url":"logging-best-practices-notes.html","date":"7 October 2021","tags":"programming, back-end ","body":"A great article by someone called Thomas about how to build useful logs. I find the inclusion of examples to be very useful, and the background information in the introduction is a nice\u00a0addition. My notes (WIP): Log after, not before - don\u2019t say what is going to happen, say what has happened. This makes it much more explicit and easier to read. The reader doesn\u2019t need to guess whether the action actually\u00a0happened."},{"title":"How to get rich without getting lucky\u00a0(notes)","category":"Non-technical/Entrepreneurship","url":"how-to-get-rich-without-getting-lucky.html","date":"5 October 2021","tags":"naval-ravikant, mental-models, business ","body":"My notes and additions from a twitter thread created by @naval on 31 May\u00a02018. Background Seek wealth, not money or\u00a0status. Wealth is having assets that earn while you\u00a0sleep. You must own a piece of a business - you won\u2019t get rich by renting out your\u00a0time. You will get rich by giving society what it wants but doesn\u2019t know how to get. At\u00a0scale. Money is how we transfer time and\u00a0wealth. Status is your place in the social\u00a0hierarchy. Understand that ethical wealth creation is possible. If you secretly despise wealth, it will elude\u00a0you. Ignore people playing status\u00a0games. Actions Learn to sell. Learn to build. 
If you can do both, you will be\u00a0unstoppable. Arm yourself with specific knowledge, accountability, and\u00a0leverage. Pick an industry where you can play long term games with long term\u00a0people. Maximise positive feedback loops. Design and defend against negative feedback\u00a0loops. All the returns in life, whether in wealth, relationships or knowledge, come from positive feedback loops. The internet has massively broadened the possible space of careers. Most people haven\u2019t figured this out yet. Write down some\u00a0examples. Play games where you iterate on past successes and experience to create more success and more experience. Pick business partners with high intelligence, energy, and, above all, integrity. Knowledge is not the same as intelligence. It\u2019s hard to estimate higher levels of expertise relative to your own. Additional knowledge can make someone look intelligent, but you can\u2019t know if it would take 100, 1000 or 10,000 hours to acquire. Don\u2019t partner with cynics and pessimists. Their beliefs are self-fulfilling. Building specific knowledge will feel like play to you but will look like work to\u00a0others. Specific knowledge is often highly technical or creative. It cannot be outsourced or\u00a0automated. Specific knowledge is found by pursuing your genuine curiosity and passion rather than whatever is hot right\u00a0now. Specific knowledge is knowledge that you cannot be trained for. If society can train you, it can train someone else and replace\u00a0you. When specific knowledge is taught, it\u2019s through apprenticeships, not\u00a0schools. Accountability Embrace and take business risks under your own name. Society will reward you with equity, and\u00a0leverage. The most accountable people have singular, public, and risky brands: Oprah, Trump, Kanye,\u00a0Elon. Leverage - capital, people, no marginal\u00a0cost. \u201cGive me a lever long enough, and a place to stand, and I will move the earth.\u201d -\u00a0Archimedes. Capital means money. To raise money, apply your specific knowledge, with accountability, and show resulting good\u00a0judgment. Fortunes require leverage. 
Business leverage comes from capital, people, and products with no marginal cost of replication (code and\u00a0media). Labor means people working for you. It\u2019s the oldest and most fought-over form of leverage. Labor leverage will impress your parents, but don\u2019t waste your life chasing\u00a0it. Capital and labor are permissioned leverage. Everyone is chasing capital, but someone has to give it to you. Everyone is trying to lead, but someone has to follow\u00a0you. Code and media are permissionless leverage. They\u2019re the leverage behind the newly rich. You can create software and media that works for you while you\u00a0sleep. An army of robots is freely available - it\u2019s just packed in data centers for heat and space efficiency. Use\u00a0it. If you can\u2019t code, write books and blogs, record videos and\u00a0podcasts. Leverage is a force multiplier for your judgement. - this seems related to comments by Geoffrey Hinton about developing your intuitions in an interview. Search through the article for all the mentions of\u00a0intuition. Learning and\u00a0practice Apply specific knowledge with leverage, and progress is\u00a0inevitable. You should be too busy to \u201cdo coffee\u201d, while still keeping an uncluttered\u00a0calendar. Set and enforce an aspirational personal hourly rate. If fixing a problem will save less than your hourly rate, ignore it. If outsourcing a task will cost less than your hourly rate, outsource\u00a0it. Who you work with and what you work on are more important than how hard you\u00a0work. Work as hard as you\u00a0can. Thinking your own thoughts is tiring. Asking good questions is\u00a0hard. Doing is faster than\u00a0watching. Developing good judgement requires experience; if you have a chance to gain real experience then take it. Learning can amplify the benefits of\u00a0experience. Real experience is more useful than a prestigious course or\u00a0degree. 
Be patient and persistent; shape your circumstances. It doesn\u2019t mean you\u2019ve made a mistake if you can\u2019t do something real right\u00a0now. There is no skill called business. Avoid business magazines and business classes. Think about why, and extend this to other\u00a0media. Passive, piecemeal knowledge acquisition by itself does not lead to specific knowledge or expertise. Subscribing to newsletters or social media accounts offers quickly diminishing returns, at\u00a0best. Read long-form media which you actively looked for. Information that comes to you for free has competing interests which put you\u00a0second. If you read only 1 book on a subject then you\u2019ll likely be a clone. If you read 2 books you\u2019ll grapple with confusion. Read 3 and you\u2019ll begin to form your own substantiated opinions. Become the best in the world at what you do. Keep redefining what you do until this is\u00a0true. There are no get rich quick schemes. That\u2019s just someone else getting rich off\u00a0you. Study game theory, psychology, persuasion, ethics, mathematics and computers. Real experience with skin in the game will teach you more than a book or a\u00a0professor. There are lots of ways to grow beyond being a beginner, but no one can do the heavy-lifting for you. A course or product that offers to teach you specific knowledge will give diminishing returns. The more of a beginner you are, the better the course will\u00a0appear. Reading is faster than\u00a0listening. Remember\u00a0\u2026 When you\u2019re finally wealthy, you\u2019ll realize that it wasn\u2019t what you were seeking in the first place. But that\u2019s for another\u00a0day. Check in with your 70 year old self, and your 10 years older self. What do they think of you now? Are they sympathetic? Proud? Do they say go faster, or slow down? Be kind to\u00a0yourself. There is a big difference between saying \u201cI am intimidated\u201d and \u201cI am feeling intimidated\u201d. You can do\u00a0it. My top\u00a02: Embrace and take business risks under your own name. 
Society will reward you with equity, and\u00a0leverage. Capital means money. To raise money, apply your specific knowledge, with accountability, and show resulting good\u00a0judgment."},{"title":"Ultra-running\u00a0benchmarks","category":"snippet","url":"ultra-running-benchmarks.html","date":"1 October 2021","tags":"courtney-dauwalter, running, 150km, jeff-pelletier ","body":"7\u00a0days/week 25\u00a0km/day Start easy. Last 50% begin to move through the\u00a0field. 40% - 50% of entrants DNF. 90% of the course is\u00a0runnable. 50km: 6:10 min/km (Diez Vista\u00a050km) 50km: 7:25 - 8:00 min/km (Beaver Flat 50 JP) 160km: 10:35 min/km (JP) 150km: 6:30 - 8:00 min/km (UTMB CD) training: 6:10 is slow enough to stay fresh. 5:10-5:30 is\u00a0fast."},{"title":"Vim built-in color\u00a0names","category":"snippet","url":"vim-built-in-color-names.html","date":"29 September 2021","tags":"vim ","body":"Script to output colors in a\u00a0buffer: :so \u2026 SO question with the\u00a0script."},{"title":"Lightning network\u00a0point-of-sale","category":"snippet","url":"lightning-network-point-of-sale.html","date":"28 September 2021","tags":"lightning, bitcoin, crypto ","body":"A project to demonstrate an offline, fast, point-of-sale device to allow merchants and customers to make fast cheap transactions on the Lightning network. Github\u00a0repo. lnurl super\u00a0protocol. Also lnbits. Introducing OFFLINE <$10 #bitcoin LN PoS. tut coming soon, workshops at #hccp21 and @AdoptingB Check out the repo! (the @lnbits wallet in the background when the transaction is made \ud83e\udd29) Ben Arc \ud83c\udff4\udb40\udc67\udb40\udc62\udb40\udc77\udb40\udc6c\udb40\udc73\udb40\udc7f\u270a\u26a1\ufe0f (@arcbtc) September 27, 2021"},{"title":"Make\u00a0files","category":"snippet","url":"make-files.html","date":"28 September 2021","tags":"automation, build-system ","body":"This article is a fairly practical introduction to the GNU make tool. 
It\u2019s not the article I was looking for though, so I might swap it out if I find a better\u00a0one."},{"title":"Linux performance analysis in 60\u00a0seconds","category":"snippet","url":"linux-performance-analysis-in-60-seconds.html","date":"27 September 2021","tags":"linux, sys-admin ","body":"A blog post with a checklist of steps to take and questions to ask when investigating a performance issue on a Linux\u00a0server."},{"title":"Stop reading the\u00a0news","category":"snippet","url":"stop-reading-the-news.html","date":"22 September 2021","tags":"humanity, culture, web ","body":"An article from the Farnam Street\u00a0blog."},{"title":"Rules for talking to\u00a0children","category":"Non-technical/Social","url":"rules-for-talking-to-children.html","date":"22 September 2021","tags":"children, communication ","body":"State the idea you wish to express as clearly as possible, and in terms preschoolers can understand. Example: \u201cIt is dangerous to play in the\u00a0street.\u201d Rephrase in a positive manner. \u201cIt is good to play where it is\u00a0safe.\u201d Rephrase the idea, bearing in mind that preschoolers cannot yet make subtle distinctions and need to be redirected to authorities they trust. \u201cAsk your parents where it is safe to\u00a0play.\u201d Rephrase your idea to eliminate all elements that could be considered prescriptive, directive, or instructive. In the example, that\u2019d mean getting rid of \u201cask\u201d: \u201cYour parents will tell you where it is safe to\u00a0play.\u201d Rephrase any element that suggests certainty. That\u2019d be \u201cwill\u201d: \u201cYour parents can tell you where it is safe to\u00a0play.\u201d Rephrase your idea to eliminate any element that may not apply to all children. Not all children know their parents, so: \u201cYour favorite grown-ups can tell you where it is safe to\u00a0play.\u201d Add a simple motivational idea that gives preschoolers a reason to follow your advice. 
\u201cYour favorite grown-ups can tell you where it is safe to play. It is good to listen to\u00a0them.\u201d Rephrase your new statement, repeating the first step. \u201cGood\u201d represents a value judgment, so: \u201cYour favorite grown-ups can tell you where it is safe to play. It is important to try to listen to\u00a0them.\u201d Rephrase your idea a final time, relating it to some phase of development a preschooler can understand. \u201cYour favorite grown-ups can tell you where it is safe to play. It is important to try to listen to them, and listening is an important part of\u00a0growing.\u201d As Arthur Greenwald, a former producer of the show, put it to me, \u201cThere were no accidents on Mister Rogers\u2019 Neighborhood.\u201d He took great pains not to mislead or confuse children, and his team of writers joked that his on-air manner of speaking amounted to a distinct language they called Freddish. Fundamentally, Freddish anticipated the ways its listeners might misinterpret what was being\u00a0said. source"},{"title":"View and change settings for GCloud CLI","category":"snippet","url":"view-and-change-settings-for-gcloud-cli.html","date":"22 September 2021","tags":"google-cloud-platform, cli ","body":"gcloud config set account gcloud auth list"},{"title":"Google\u00a0Pub/Sub","category":"snippet","url":"google-pub-sub.html","date":"20 September 2021","tags":"google-cloud-platform, message-systems ","body":"Google Pub/Sub has client libraries in all the usual languages. You can also construct the API calls yourself. This is a link to the documentation. If I were to use a JavaScript beacon to push a message to a topic, my web analytics engine (a collection of cloud functions) could subscribe to the\u00a0topic. 
This would have the advantage that if the site were flooded with traffic and the maximum number of function instances wasn\u2019t enough to handle all the events, then none of the page views (or other events) would be lost, because Pub/Sub guarantees delivery and would keep trying to deliver the message for up to 7\u00a0days. Using the beacon to trigger the cloud functions directly wouldn\u2019t work at scale, because once the maximum number of instances are being triggered as frequently as the function takes to run, the endpoint would become unresponsive. There is no caching\u00a0layer. However, the site isn\u2019t being flooded with traffic, and I have better things to do than fix stuff that isn\u2019t\u00a0broken."},{"title":"Google Cloud Storage - TTL and CORS\u00a0settings","category":"snippet","url":"gcp-storage-bucket-ttl-and-cors-settings.html","date":"20 September 2021","tags":"google-cloud-platform, storage, cloud, caching ","body":"Documentation explaining how to set CORS and TTL for a storage bucket. Create a JSON file with something like\u00a0this: [{ \"origin\": [\"...\"], \"method\": [\"GET\", \"PUT\"], \"maxAgeSeconds\": 300 }] gsutil cors set cors.json gsutil cors get | jq"},{"title":"Handwritten letters at\u00a0scale","category":"snippet","url":"handwritten-letters-at-scale.html","date":"20 September 2021","tags":"automation, hardware, marketing, raspberry-pi ","body":"A twitter thread showing how a business started creating handwritten notes to customers using a plotter, and how they scaled up and further automated the\u00a0process. It would be cool to create something like this for western Europe. \u2026 seems to be the main commercial implementation of this idea in NA. Competitors \u2026 I still think it would be fun to make a simple photo printing service. 
Share a photo in a WhatsApp message and it\u2019s printed and\u00a0delivered."},{"title":"The dangers of social\u00a0media","category":"snippet","url":"the-dangers-of-social-media.html","date":"16 September 2021","tags":"technology, mobile, social-media ","body":"A ledger (what\u2019s a ledger? It\u2019s a webpage) from the \u201cCenter For Humane Technology\u201d showing some of the dangers of social\u00a0media. The \u201cDo unto others\u201d section is outstanding. It describes the views of some tech company leaders, including how they limit usage of devices and social media for themselves and their\u00a0kids."},{"title":"Visual mathematical\u00a0proofs","category":"snippet","url":"visual-mathematical-proofs.html","date":"11 September 2021","tags":"math ","body":"A question on Stack Overflow asking for great visual proofs, e.g. the area of a\u00a0circle. archive"},{"title":"1 Peter 1 vs\u00a010","category":"Non-technical/Journal","url":"1-peter-1-vs-10.html","date":"10 September 2021","tags":"bible ","body":"Concerning this salvation, the prophets who prophesied about the grace that was to be yours searched and inquired carefully, inquiring what person or time the Spirit of Christ in them was indicating when he predicted the sufferings of Christ and the subsequent glories. It was revealed to them that they were serving not themselves but you, in the things that have now been announced to you through those who preached the good news to you by the Holy Spirit sent from heaven, things into which angels long to look. The salvation that Peter is referring to is described previously. He was encouraging us that even though our salvation is mysterious, it is very valuable, and that the inexpressible joy that we feel is not unusual or a cause for concern. Don\u2019t be uncomfortable. Grace and peace. 
To encourage us even more, Peter tells us that even the prophets (he implicitly assumes his readers are familiar with the prophets, so I guess they\u2019re definitely Jewish, as guessed in the first post in this series) searched for understanding about our salvation. They wanted to know who the Christ would be and when he would arrive. I guess they initially thought they were doing this to satisfy their own curiosity, because it had to be revealed to them that this was in fact a service to the saints that would come after them. Interestingly, it says that it was the Spirit who predicted Christ\u2019s sufferings and the following glories. I would not have expected that the Spirit of God needed to do any predicting about Christ\u2019s sufferings or glories. Are you even able to predict something if you\u2019re omnipotent? Back to the text, Peter says that things have been announced to us via those who preached the good news to us by the Holy Spirit, who was sent from heaven. Preaching by the power or enabling of the Holy Spirit has revealed things to us that angels long1 to know. We don\u2019t know everything and we still have questions, but some of our questions are answered, some of them inexpressible, and some of the things we find mysterious or uncomfortable are supposed to be so until our salvation is complete. Grace and peace. Hang in there, you\u2019re doing ok. Long, or longed? I would have expected it to be \u201clonged to know\u201d, but it\u2019s not. Does chronology get messed up from our perspective if the other party is outside of time? 
\u21a9"},{"title":"9/11 and the American\u00a0psyche","category":"snippet","url":"9-11-and-the-american-psyche.html","date":"10 September 2021","tags":"history, essay ","body":"An essay in the FT: \u201cTwenty years later, the political and psychological consequences of the attack have become \u2026\u201d archive"},{"title":"How doctors\u00a0die","category":"snippet","url":"how-doctors-die.html","date":"9 September 2021","tags":"humanity, death ","body":"An article about why doctors want less treatment for terminal illnesses than the rest of us. archive"},{"title":"Life","category":"snippet","url":"life.html","date":"8 September 2021","tags":"aphorism ","body":"If youth is wasted on the young, then life is wasted on the\u00a0living."},{"title":"Slavery","category":"snippet","url":"slavery.html","date":"8 September 2021","tags":"humanity ","body":"This article about how people become slaves is very \u2026 It puts a very different perspective on my life, ambitions and \u2026"},{"title":"Cloud Functions Minimum\u00a0Instances","category":"snippet","url":"cloud-functions-minimum-instances.html","date":"8 September 2021","tags":"google-cloud-platform, serverless ","body":"Google Cloud Platform have introduced a \u201cminimum instances\u201d feature in Cloud Functions. If you have a function using 128MB of memory (the smallest option), it will cost a minimum of \u20ac0.061 per day (\u20ac1.86 per month) to run 1 idle\u00a0instance."},{"title":"nmap","category":"snippet","url":"nmap.html","date":"4 September 2021","tags":"networking, cli ","body":"sudo nmap -sP 192.168.1.0/24 | grep \"Nmap\" Thanks to Jeff Geerling (again) I found this nifty command to see what devices are connected on a local\u00a0network. Found in this article about setting up a\u00a0pi-hole."},{"title":"1 Peter 1 vs\u00a03","category":"Non-technical/Journal","url":"1-peter-1-vs-3.html","date":"27 August 2021","tags":"bible ","body":"May grace and peace be multiplied to you. Part 1: Blessed be the God and Father of our Lord Jesus Christ! 
According to his great mercy, he has caused us to be born again to a living hope through the resurrection of Jesus Christ from the dead, to an inheritance that is imperishable, undefiled, and unfading, kept in heaven for you, who by God\u2019s power are being guarded through faith for a salvation ready to be revealed in the last time. Part 2: In this you rejoice, though now for a little while, if necessary, you have been grieved by various trials, so that the tested genuineness of your faith - more precious than gold that perishes though it is tested by fire - may be found to result in praise and glory and honor at the revelation of Jesus Christ. Part 3: Though you have not seen him, you love him. Though you do not now see him, you believe in him and rejoice with joy that is inexpressible and filled with glory, obtaining the outcome of your faith, the salvation of your souls. I thought that writing about 1 paragraph per post would be easy, maybe even too short, but this is so dense that I\u2019m not sure. The last three highlighted paragraphs were originally one paragraph; I\u2019ve just split them up to make it easier to write about. Part 1 After Peter\u2019s greeting (previous post), he starts his letter by praising Father God - the Lord of Lords and King of Kings. Peter says that it is by God\u2019s mercy that we have been born again, and that we are able to enjoy a living hope. As soon as Peter says this he begins to explain how, and he uses a chain of reasoning again, like he did in his greeting. I\u2019m going to try and paraphrase in order to make the text less dense and make its meaning more obvious: Because of God\u2019s mercy, we have a hope that is alive and so significant it is as if we have been reborn. Our life is new, reset, fundamentally different. This is a direct result of Jesus dying and resurrecting. The living hope seems to be hoping (or trusting) in the inheritance, which is promised but not yet received. 
The inheritance is described as: Imperishable - cannot be destroyed. Undefiled - cannot be made dirty, less pure, less valuable. Unfading - doesn\u2019t get old or less good as time passes. It doesn\u2019t diminish. Kept in heaven for you - deliberately reserved in heaven for you, in particular. By God\u2019s power, you are being guarded until your salvation. This protection is received via your faith. Protection comes through your faith. Not because of it, or from it, but through it. The purpose of the protection is to keep you safe until your salvation, which will only be revealed to you in the last time. Your salvation is mysterious because it hasn\u2019t been revealed yet. This last point is itself mysterious. Part 2 The tested genuineness of your faith is more precious than gold. Not just the genuineness, integrity, or sincerity of your faith, but the tested sincerity of your faith. This tested sincerity is more precious than gold. Your faith is contrasted with gold that will eventually decompose - even really high quality gold that is ultra refined. But your tested and sincere faith is more resilient and more valuable than gold. The process that creates a tested and genuine faith in Jesus is therefore to be appreciated because of the end result. If you have a living hope of an inheritance through Jesus\u2019 resurrection, then rejoicing would be a good response even if it is causing you grief and many trials. The trials and grief will ultimately result in praise and glory and honor during the revelation (revealing) of Jesus Christ. I\u2019m not sure if this is glory and honor for Jesus, or for the person who has a tested and genuine faith. Maybe both? Part 3 I don\u2019t know if this passage is instruction or reassurance or both. I guess it\u2019s both at the same time - you should love him, even though you\u2019ve never seen him. And also, don\u2019t feel uneasy about having never seen him. 
It\u2019s unusual to love someone you\u2019ve never seen, but in this situation it\u2019s normal. Don\u2019t worry. Grace and peace. The same goes for inexpressible joy - your hope in Jesus, his resurrection and your coming resurrection makes you happy in ways that you didn\u2019t expect and can\u2019t really articulate or explain - this is ok! It is more than ok - it is evidence of the sincerity and truthiness of your faith, and is expected. Believing in someone you\u2019ve never seen, and loving someone you\u2019ve never seen, is going to result in you obtaining salvation. It\u2019s unusual, and different, but don\u2019t worry. Grace and peace. The paragraph breaks are those in my English bible. Does the original text even have paragraphs? \u21a9"},{"title":"Understanding GCP\u00a0charges","category":"snippet","url":"understanding-gcp-charges.html","date":"27 August 2021","tags":"google-cloud-platform ","body":"A clear article solving most of my problems around understanding why I\u2019m incurring charges on Google Cloud Platform (Gcloud, GCP) when I\u2019m clearly within their free usage\u00a0limits. The deployment costs section in this page is also\u00a0useful."},{"title":"My dead dad\u2019s\u00a0journal","category":"snippet","url":"my-dead-dads-journal.html","date":"26 August 2021","tags":"humanity ","body":"Your relationships will come to define your life more than anything\u00a0else. A blog post about someone who read their dad\u2019s journal after he died. It\u2019s about regret, grief, and internal struggles, and about their tangible consequences. The comments are worth reading too."},{"title":"1 Peter 1 vs\u00a01","category":"Non-technical/Journal","url":"1-peter-1-1.html","date":"25 August 2021","tags":"bible ","body":"People usually say that the bible is a collection of books, but some of the books are actually letters. Two of the letters were written by Peter, who was one of Jesus\u2019 disciples. \u201c1 Peter\u201d is the first\u00a0letter. The other numbers are the chapters and verses.
They\u2019re used to divide the text so that specific parts can be referred to easily. They\u2019re not part of the original text - chapters were introduced around the 13\(^{th}\) century, and verses in the 16\(^{th}\). \(^{1}\)To those who are elect exiles of the Dispersion in Pontus, Galatia, Cappadocia, Asia, and Bithynia, according to the foreknowledge of God the Father, in the sanctification of the Spirit, for obedience to Jesus Christ and for sprinkling with his blood: May grace and peace be multiplied to you. This is the start of a letter to a specific group of people. It\u2019s intended to be encouraging, and dense from the very beginning. It is intended that the letter is remembered and paid attention\u00a0to. The sentence structure is awkward when it\u2019s translated into modern English because it\u2019s a translation. Presumably in the original language you could create long statements with many clauses in them and people would be comfortable with that. I\u2019m unaware of the letter writing customs of the day, so maybe Peter is using a very normal type of introduction, or maybe he is deliberately imitating the style of a specific type of letter, or\u00a0person. The first sentence implies that exiles of the Dispersion (what dispersion? I don\u2019t know) are not exiles by accident. Even though I bet becoming exiled felt like an unplanned interruption to their life. The first sentence says that God chose them, and that there is a purpose. There is a chain of reasoning that involves God the Father, God the Son and God the Holy Spirit - the trinity. One God in three persons\u2026 I don\u2019t get it, I can\u2019t explain it, but I\u2019m going to try and work with it. According to the foreknowledge of God the Father\u00a0\u2192 in the sanctification of the Spirit\u00a0\u2192 for obedience to Jesus Christ and for sprinkling with his\u00a0blood. It\u2019s like 1 and 2 are requirements for 3, because \u201cfor\u201d means \u201cthe reason why something was\u00a0done\u201d.
I should find out what \u201csprinkling with his blood\u201d means, instead of guessing. Clearly it\u2019s a weird and gross\u00a0idea. It means\u2026 being marked by Jesus\u2019 blood. It seems clear enough that this is the blood that was spilt when he was sacrificially murdered, and it seems that the benefit or the reason why this would be desirable is already obvious to the readers. Having Jesus\u2019 blood on you is to associate yourself with the benefits of that sacrifice as well as to count the cost of the\u00a0sacrifice. It\u2019s probably a line of reasoning that would be intuitive to believers with a Jewish background if they\u2019ve previously sacrificed animals in temples during various festivals. Is this letter written to a group of believers with a Jewish (rather than Gentile) background? Back to the chain of reasoning: the foreknowledge of the Father and the sanctification of the Spirit are necessary for being associated with Jesus\u2019 sacrifice and for being obedient to Jesus. That\u2019s a big conclusion from only the first sentence. May grace and peace be multiplied to you. I like that exhortation so much I could get it as a tattoo. Grace and Peace, better than fine dining and a comfortable bed. Also, chewing over the implications of the previous sentence is enough to cause confusion and angst - so the exhortation is well timed. Chill - even though you\u2019ve been exiled, you weren\u2019t planning on being a migrant, and God planned this for you, and this is part of your necessary sanctification in order to be associated with Jesus\u2019 sacrifice - you can be full of peace, and there is lots of grace (which is forgiveness and patience, and maybe gentleness) available to\u00a0you.
Next\u00a0sentence."},{"title":"Miller CLI","category":"snippet","url":"miller-cli.html","date":"25 August 2021","tags":"cli, tools ","body":"Miller is like awk, sed, cut, join, and sort for data formats such as CSV, TSV, and tabular JSON. Try this if you\u2019re operating on JSON or CSV data, and you want to use column names instead of positional indices."},{"title":"Conditionally setting your git\u00a0config","category":"snippet","url":"conditionally-setting-your-git-config.html","date":"24 August 2021","tags":"vcs, git ","body":"Blog post showing how you can set git variables depending on where you are in a directory tree. It hinges on conditional includes."},{"title":"Daddy-Daughter To-Do\u00a0List","category":"Non-technical/Journal","url":"daddy-daughter-to-do-list.html","date":"22 August 2021","tags":"daughter, family ","body":"Things we want to do\u00a0together: Make some\u00a0furniture Run to the\u00a0beach \u00a0Parkour Go on bike\u00a0rides Go to work together (when she is\u00a0older) Go on long train journeys together (Paris and\u00a0Salzburg) Go to restaurants and have dinner Take interesting people to lunch and ask them\u00a0questions"},{"title":"Conway\u2019s\u00a0Law","category":"snippet","url":"conways-law.html","date":"16 August 2021","tags":"organisations ","body":"Organizations which design systems \u2026 are constrained to produce designs which are copies of the communication structures of these organizations. - Melvin\u00a0Conway In\u00a0reverse: If the architecture of the system and the architecture of the organization are at odds, the architecture of the organization wins. - Ruth\u00a0Malan"},{"title":"Elon Musk\u2019s Engineering\u00a0Philosophy","category":"Technical/Engineering","url":"elon-musk-s-engineering-philosophy.html","date":"8 August 2021","body":"Whilst giving Tim Dodd a tour of a SpaceX facility, Musk described an interesting five-step process: Make the requirements less dumb.
The requirements are definitely dumb; it does not matter who gave them to you. He notes that it\u2019s particularly dangerous if an intelligent person gives you the requirements, as you may not question the requirements enough. \u201cEveryone\u2019s wrong. No matter who you are, everyone is wrong some of the time.\u201d He further notes that \u201call designs are wrong, it\u2019s just a matter of how\u00a0wrong.\u201d Try very hard to delete the part or process. If parts are not being added back into the design at least 10% of the time, not enough parts are being deleted. Musk noted that the bias tends to be very strongly toward \u201clet\u2019s add this part or process step in case we need it.\u201d Additionally, each required part and process must come from a name, not a department, as a department cannot be asked why a requirement exists, but a person\u00a0can. Simplify and optimize the design. This is step three as the most common error of a smart engineer is to optimize something that should not\u00a0exist. Accelerate cycle time. Musk states \u201cyou\u2019re moving too slowly, go faster! But don\u2019t go faster until you\u2019ve worked on the other three things\u00a0first.\u201d Automate. An important part of this is to remove in-process testing after the problems have been diagnosed; if a product is reaching the end of a production line with a high acceptance rate, there is no need for it. archive"},{"title":"Pen-testing web\u00a0apps","category":"snippet","url":"blog-post.html","date":"6 August 2021","tags":"penetration-testing, hacking, web-apps, credentials ","body":"A blog post about how someone compromised a group of web\u00a0apps.
It lists a series of technologies and techniques that the author uses as they progress. These would make a useful list of things to know in order to build safe web-apps and not repeat the mistakes of the unfortunate victims."},{"title":"Running through\u00a0adversity","category":"snippet","url":"running-through-adversity.html","date":"6 August 2021","tags":"hard-rock, running, sport ","body":"The toughness of elite ultra-runners is remarkable. Just one of the problems Sabrina Stanley encountered would be enough to justify dropping\u00a0out. archive"},{"title":"Starbase Tour with Elon\u00a0Musk","category":"snippet","url":"starbase-tour-with-elon-musk.html","date":"4 August 2021","tags":"elon-musk, starbase, engineering ","body":"Part 1 of an incredible interview and tour of Starbase with Elon\u00a0Musk. archive"},{"title":"Remote\u00a0Working","category":"snippet","url":"remote-working.html","date":"2 August 2021","tags":"remote, engineering ","body":"I live and work near\u00a0Amsterdam. My manager is in\u00a0Berlin. A stakeholder is in\u00a0Boston. Another stakeholder is in\u00a0India. Another stakeholder is in\u00a0Ireland. \u00af\\_(\u30c4)_/\u00af"},{"title":"Athletes, Careers, and Mental\u00a0Health","category":"snippet","url":"athletes-and-mental-health.html","date":"2 August 2021","tags":"sport, competition, comparmentalization, psychology ","body":"Being very good at one particular thing can make it easy to be bad at normal\u00a0things. Getting over\u00a0gold. Insecure Overachievers. Searching the FT brings up several\u00a0articles."},{"title":"WTF\u00a0Python","category":"snippet","url":"wtf-python.html","date":"2 August 2021","tags":"python, programming ","body":"A repo of curious Python behaviours."},{"title":"Beach\u00a0Photos","category":"Non-technical/Photographs","url":"beach-photographs.html","date":"1 August 2021","body":""},{"title":"Heuristics for effective software\u00a0development","category":"snippet","url":"heuristics-for-effective-software-development.html","date":"26 July 2021","tags":"engineering, teams, organisations 
","body":"\u201cWithout psychological safety, respect, and trust, none of the following is\u00a0possible.\u201d \u201cThe best ways to work are collaborative. Negotiation is not collaboration. Isolated individuals making heroic efforts are never as effective as collaborative groups. We get the best results when customers, business people, and developers literally work together.\u201d Blog\u00a0post"},{"title":"Move a file between git\u00a0branches","category":"snippet","url":"move-a-file-between-git-branches.html","date":"14 July 2021","tags":"git ","body":"Checkout the branch where you want the file to be copied to,\u00a0then: git checkout other-branch -- path/to/file Use a commit hash to pull files from any\u00a0commit. Multiple files and directories can be\u00a0specified. Overwrites the\u00a0file. SO"},{"title":"Start with Finance to transform IT","category":"snippet","url":"start-with-finance-to-transform-it.html","date":"14 July 2021","tags":"engineering, organisations, business, corporations ","body":"Zwischenzugs blog post arguing that to achieve a significant change in an organisation you need\u00a0to: Get\u00a0funding. Persuade the finance department to give you\u00a0money. Understand what they\u00a0value. Understand their cash\u00a0flows. Understand how and why customers or clients part with their\u00a0money. Understand business constraints (legal, etc.). The five whys approach: \u201cConsider a deeper structural cause of cultural problems in change management: how money flows through the organisation.\u201d \u201cIf you want to transform IT in an enterprise, start with finance. If you can crack that, you\u2019ve a chance to succeed with sec and controls functions. If you don\u2019t know why it\u2019s important to start with finance, you\u2019ll definitely fail.\u201d"},{"title":"The Worst Volume Control UI","category":"snippet","url":"the-worst-volume-control-ui.html","date":"14 July 2021","tags":"ui ","body":"Hilarious article from UI Collective showing the results of a competition to design the worst possible volume control interface.
\ud83d\ude02 \ud83d\ude02\u00a0\ud83d\ude02"},{"title":"Playing with Google Cloud\u00a0Platform","category":"Technical/Engineering","url":"playing-with-google-cloud-platform.html","date":"13 July 2021","tags":"cloud, google-cloud-platform, serverless ","body":"Building my own web analytics has been a gateway to learning more about Google Cloud Platform. So far I\u2019ve used DataStore, BigQuery, Cloud Functions, Pub/Sub, Storage Buckets and Scheduler. Version 1 My first version of the analytics tool took the following form: Logger Function When the browser navigates to a page on the site, a JavaScript beacon is sent which triggers a cloud function. The function parses the page URL and the IP address, and creates a record in the database. Aggregator Function Each time the analytics page is loaded, a cloud function is triggered that gets every record from DataStore, parses the data and returns a JSON object containing the aggregated data. The browser receives the JSON, parses it and creates some charts and tables. The good: It works, and it was quick and simple to build. The bad: It\u2019s expensive. Loading the analytics page is slow - when the DataStore was small it took 3 or 4 seconds; after a few thousand page views it took about 40 seconds. Conclusion - keep the logger function as it is, but improve the aggregator function. Version 2 The second version still used DataStore but was much more efficient. It didn\u2019t read the entire database and generate the aggregated results every time the analytics page was viewed. Instead, a cloud function periodically collected all the records in the DataStore database and calculated the results. The results were written to a JSON file and sent to a storage bucket. When the analytics page was loaded in a browser, the browser collects and processes the JSON file from the bucket. This is much faster and cheaper than creating a new JSON object each time the analytics page is viewed.
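The Version 2 pattern (periodically crunch every raw record into one JSON blob destined for a bucket) can be sketched roughly as below. This is a minimal sketch, not the post's actual code: the record fields and function names are my own assumptions, and the Storage upload is indicated only in a comment.

```python
import json
from collections import Counter

def aggregate(records):
    """Aggregate raw page-view records into per-page totals.

    `records` is assumed to be a list of dicts like
    {"path": "/about.html", "ip": "1.2.3.4"} - an illustrative schema,
    not the real one.
    """
    views = Counter(r["path"] for r in records)
    unique_visitors = {
        path: len({r["ip"] for r in records if r["path"] == path})
        for path in views
    }
    return {"views": dict(views), "unique_visitors": unique_visitors}

def aggregate_to_json(records):
    # In the real cloud function this string would be pushed to a
    # Storage bucket, e.g. with google-cloud-storage:
    #   bucket.blob("analytics.json").upload_from_string(payload)
    return json.dumps(aggregate(records))
```

The client then fetches the pre-computed JSON instead of triggering a fresh scan of the database on every page view.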
The good: The analytics page loads at the same speed regardless of how much data has been aggregated and how frequently the analytics page is being viewed. Performance issues have been solved, though I still don\u2019t think DataStore is the best database solution for this use case. The bad: DataStore seems expensive - I am being charged for AppEngine services (which I don\u2019t really understand, but is caused by using DataStore). If I can get monthly costs down to about a cup of coffee (about \u20ac4/month or \u20ac0.15/day) then I don\u2019t mind running it indefinitely. Version 3 Use BigQuery instead of DataStore. BigQuery is a Data Warehouse that is well suited for analytics. It is not well suited for transactional use cases - where data is being read, updated or created many times per second. This is fine for my use case - the Page Logger function writes a record to a BigQuery table each time a page view is logged. During times of high traffic it\u2019s possible that concurrency issues might arise and some page views will be lost, but this isn\u2019t an issue 99% of the time. My site traffic is very light. I believe I could use a newer API that Google recently released to solve this problem, but for now I\u2019ll use the normal API. The rest of the process is unchanged - the aggregator function periodically reads the (BigQuery) database, crunches that data and sends a JSON file to a storage bucket. The good: This is completely free. The analytics page can be viewed quickly regardless of the amount of site traffic. The bad: Under heavy traffic some page views might be lost due to a limit on how quickly new rows can be added to BigQuery tables. Using a new API might resolve this. Conclusion Totally free tools forever. The combination of Cloud Functions, Storage Buckets and BigQuery (along with Scheduler and Pub/Sub) seems really versatile and I think there are many interesting things that could be done by combining these services1.
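The Version 3 logger described earlier (one BigQuery row per page view, via the normal streaming-insert API) might look something like this sketch. The table and field names are my own invention, not the post's, and the BigQuery calls are left as comments since they need credentials:

```python
from datetime import datetime, timezone
# from google.cloud import bigquery  # needed in the real cloud function

def make_row(page_url, ip):
    """Build one page-view row. Field names are illustrative."""
    return {
        "url": page_url,
        "ip": ip,
        "ts": datetime.now(timezone.utc).isoformat(),
    }

def log_page_view(request):
    # Sketch of the logger cloud function: one streaming insert per view.
    # client = bigquery.Client()
    # errors = client.insert_rows_json(
    #     "my-project.analytics.page_views",   # hypothetical table id
    #     [make_row(request.args["url"], request.remote_addr)],
    # )
    # insert_rows_json returns a list of per-row errors; under heavy
    # traffic some inserts can fail, which matches the data-loss caveat.
    pass
```

The Storage Write API the post alludes to is the newer, higher-throughput alternative to these legacy streaming inserts.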
Using them all for free (my usage is well within the free tier) makes the possibilities even more interesting. Having compute and storage services running indefinitely in the cloud for free is amazing. Documentation Aggregate by calendar month An improvement to this analytics setup would be creating aggregated metrics for each calendar month and storing them in separate JSON files. This would prevent data older than one month being processed repeatedly, and create a cap on the amount of computational effort required (the maximum amount of data processed by one cloud function instance would become capped at one month). If the browser wanted to display more than one month of data, it would simply request more than one JSON file from the storage bucket. TODO Frontend - the DataTables column containing the date should be sorted as a Date object. It is being sorted like a normal string. Backend - create separate JSON files for each month. The question then becomes: \u201cJust because you could do it, should you do it?\u201d \u21a9"},{"title":"Daughter","category":"snippet","url":"daughter.html","date":"12 July 2021","tags":"family ","body":"Yesterday my daughter asked me to write a page in"},{"title":"Moral\u00a0tyranny","category":"snippet","url":"moral-tyranny.html","date":"12 July 2021","tags":"oppression, consent ","body":"Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron\u2019s cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end, for they do so with the approval of their own conscience. They may be more likely to go to Heaven yet at the same time likelier to make a Hell of\u00a0earth. This very kindness stings with intolerable insult.
To be \u201ccured\u201d against one\u2019s will and cured of states which we may not regard as disease is to be put on a level of those who have not yet reached the age of reason or those who never will; to be classed with infants, imbeciles, and domestic animals. - C. S.\u00a0Lewis"},{"title":"Upgrading Cryptographic\u00a0Libraries","category":"snippet","url":"upgrading-cryptographic-libraries.html","date":"10 July 2021","tags":"hashing, versioning ","body":"Blog post about how to make it easier to upgrade a cryptographic library or algorithm. Django encodes passwords for database storage like\u00a0this: <algorithm>$<iterations>$<salt>$<hash> Interestingly, Giovanni Collazo emphasises that we should design systems for change, which initially seems pretty close to contradicting YAGNI, but the answer lies in the\u00a0context."},{"title":"Startup Engineering\u00a0Lessons","category":"snippet","url":"startup-engineering-lessons.html","date":"10 July 2021","tags":"startup, engineering ","body":"Lessons of a startup engineer is a great blog post from Todd Wolfson. So great that I might write notes on it like I would a\u00a0book. archive"},{"title":"Poisson\u2019s\u00a0Equation","category":"snippet","url":"poissons-equation.html","date":"6 July 2021","tags":"math ","body":"A great article introducing and showing the relevance of Poisson\u2019s equation."},{"title":"Thomas\u00a0Aquinas","category":"snippet","url":"thomas-aquinas.html","date":"6 July 2021","tags":"theology, history ","body":"His works in English and\u00a0Latin."},{"title":"Can an explanation of an historical event ever be completely\u00a0true?","category":"Non-technical/Journal","url":"historical-truths.html","date":"6 July 2021","tags":"history ","body":"We use historical events as examples to learn from, but is it possible to acquire a true understanding of a historical event? The answer is important because we use that understanding to learn by example, and identify patterns of cause and effect. Is it possible for a normal person to do this, or does it require training, lots of time, or special skills?
Is it impossible for\u00a0everyone? I started thinking about this when I had the following thought: When we think we understand why something historical happened, all we\u2019ve probably done is accept a story that we\u2019ve been told. Limitations An account of an event must necessarily be a simplification - not all details can be recorded or learned. This is ok because most details are not pertinent and have no consequence. But how do we choose which details are included, and how do we verify that the details were recorded accurately? How can we be sure that a lesson or conclusion based on historical events is\u00a0reliable? The truthfulness of a historical story could be thought of as a value on a scale ranging from completely false to perfectly true. I think that there are mechanisms that push popular or resilient narratives towards the middle of this scale and away from the extremes. As a version of a story approaches the dishonest end it will contain an increasing number of errors or omit an increasing number of pertinent facts. This has the effect\u00a0of: Increasing the likelihood and frequency of someone hearing the story and rebutting it. Making it harder to align the assertions and implications with existing understandings of\u00a0reality. At the opposite end of the scale, a story is unlikely to be simple. Adding truth requires adding complexity It is easier to create or capture a simple story than a complex one. As a story\u2019s detail and depth increase, so do the resources required to communicate it. Each narrative is competing for attention, and a complex story requires more resources to broadcast and listen to than a simple story. Those with the resources to do so will want the story to benefit them in some\u00a0way.
This creates incentives to omit inconvenient truths. Implications I think that there is unfortunately no substitute for the hard work of coordinating disparate information, because the truthiness of a conclusion is generally proportional to the inconvenience of the effort spent forming\u00a0it. We are predisposed to choose convenience over inconvenience, and this\u00a0enables: History to be written by the \u201cwinning side\u201d, who have more resources than the \u201closing\u00a0side\u201d. Complex events to be simplified into expedient and convenient narratives. As time passes, the practical benefit of holding a view that differs from a popular narrative decreases. This reduces any incentive to challenge a popular\u00a0view. This creates a feedback loop that makes it increasingly difficult for a younger generation to discover information about historical events that challenges a popular narrative. A\u00a0heuristic How much evidence did I collect myself, that wasn\u2019t brought to my attention by an algorithm or by someone\u00a0else? Was WW2 a battle of the good against the bad, or the bad against the really bad, or something else? Why did the Allies win WW2? Was the influence of government on social freedoms in America 100 years ago dissimilar to China\u00a0today? Racial and ethnic discrimination was normal 100 years ago - it appears to have been so universally accepted that I\u2019m led to question the assumptions of modern attitudes about human nature and\u00a0morality."},{"title":"Load-testing my Web Analytics\u00a0Tool","category":"Technical/Web","url":"load-testing-web-analytics-tool.html","date":"2 July 2021","tags":"google-cloud-platform, cloud-functions, api ","body":"Table of Contents Background The Hacker News effect Bad\u00a0news Good\u00a0news API traffic for the\u00a0tool Dashboard for the get-analytics cloud\u00a0function Dashboard for the logger function The Solution Idea 1: Use global\u00a0objects Idea 2: Store the results themselves in the\u00a0database Idea 3: Forget DataStore, use\u00a0buckets Background I posted a previous article (about building an analytics tool) onto the Hacker News forum.
It was quickly buried and didn\u2019t get any\u00a0attention. To my surprise, I received an email from a Hacker News administrator (Daniel) explaining that it was a good quality post and would be boosted to the front page at a random time within the next couple of\u00a0days. Sure enough, in the early hours of the next morning, the post was boosted. I woke up to various notifications that people had started following me on twitter, which never happens. After delegating the kid\u2019s breakfast duties, I logged into GCP to see what effect the extra traffic had on my\u00a0tool. The Hacker News\u00a0effect Traffic had increased by about 30x and my hastily built tool was looking very sub-optimal. Two problems stood out - the aggregated analytics data was taking anywhere from 20-30 seconds to load (up from a passable-ish 5 seconds under normal conditions), and I was running up a bill. Bad\u00a0news The reason for both of these problems was a shockingly inefficient and lazy approach to serving the\u00a0analytics. Each time the analytics page was loaded, a cloud function would fetch all the data in the DataStore database, munch all that data and return a freshly derived blob of JSON. Never mind that almost the exact same computation had occurred hundreds of times\u00a0already. As the amount of data in the DataStore increases, so does the time required to serve the analytics page. In the second chart below (dashboard for the get-analytics cloud function), it looks like the execution time increases at a rate of O(log\u00a0n). Good\u00a0news The good news, though, was that the logger function was handling the extra traffic smoothly. You can see in the dashboard image below (click on it) that almost every request was completed in less than 200ms, which I think is fine for a background process.
I could also see the active instances scaling up and down well within their preset limits, as\u00a0expected. API traffic for the\u00a0tool Dashboard for the get-analytics cloud\u00a0function Dashboard for the logger function The\u00a0Solution I began to ponder the importance of all the things I didn\u2019t know about databases, and what DataStore might be good and bad at doing. Scrolling through the documentation, I could see Google boasting of super quick writes, but not super quick reads. I\u2019d already seen how many API calls were being made to the Cloud DataStore API and knew I\u2019d probably have to redesign part of the\u00a0tool. Idea 1: Use global\u00a0objects I attempted a few easy wins, mostly using the idea that if an instance of a function was invoked multiple times before being powered-down, then global objects would still be available in\u00a0memory. If I put the data collected from the DataStore into a global object then I could check for its existence in subsequent function calls. This would save a lot of API calls and likely remove the largest bottleneck, saving my readers 10+ seconds of waiting. For whatever reason, this didn\u2019t work. Even if it had, the tool could still be vastly improved by taking a different approach that would be even faster and also reduce costs. I\u2019d like to have this tool running indefinitely, so reducing daily costs to an absolute minimum is\u00a0important. Idea 2: Store the results themselves in the\u00a0database It was obviously inefficient to repeat the same calculations multiple times. A good long-term solution would require aggregating the data periodically and then fetching and serving these aggregated data to the\u00a0client. I tried putting the JSON into the DataStore using a different key, but ran into errors about the data for each entity being too large. Even if I split the aggregated data into multiple component parts it would still be too large, and would grow over time. I guess DataStore isn\u2019t meant to be used like\u00a0this.
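Idea 1 above (reusing a module-level global across warm invocations of the same function instance) can be sketched like this. It's a minimal sketch under assumptions: the cache shape, the TTL, and the fetcher callable are illustrative, not the tool's real code:

```python
import time

# Module-level state survives between invocations while a cloud function
# instance stays warm; a cold start resets it.
_cache = {"data": None, "fetched_at": 0.0}
CACHE_TTL_SECONDS = 300  # illustrative refresh interval

def get_records(fetch_from_datastore):
    """Return cached records if fresh, otherwise re-fetch.

    `fetch_from_datastore` stands in for the real DataStore query, so the
    expensive call only happens when the cache is empty or stale.
    """
    now = time.time()
    if _cache["data"] is None or now - _cache["fetched_at"] > CACHE_TTL_SECONDS:
        _cache["data"] = fetch_from_datastore()
        _cache["fetched_at"] = now
    return _cache["data"]
```

Each warm invocation after the first skips the database round-trip entirely; the trade-off is serving slightly stale data and getting no benefit on cold starts, which may be why it disappointed in practice.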
I probably could have pursued this idea a bit further, but I didn\u2019t want to change the structure of the JSON blob served to the client. If I did change it then I\u2019d need to rewrite the client side JavaScript as\u00a0well. Client side work is faster than back-end, but writing JavaScript is fiddly compared to Python in my opinion. There\u2019s always multiple ways of doing a thing, and several versions of an API, so googling a solution isn\u2019t as simple as for\u00a0Python. Idea 3: Forget DataStore, use\u00a0buckets Final idea - store the results as a JSON blob in a Storage bucket and point the client at the bucket instead of the cloud function. Turns out this is a super fast and efficient solution. The analytics page now loads in less than half a second, and the only variable costs are egress on the bucket, which will be much smaller than the comparable costs of running a cloud function. The computational expense of calculating the analytical results is fixed and decoupled from the number of page views, using the following pipeline: Every few minutes Cloud Scheduler targets a Pub/Sub\u00a0topic. The topic triggers a cloud function. The Cloud Function then: Queries the DataStore and collects the\u00a0data. Calculates the results. Generates a JSON blob containing the\u00a0results. Pushes the JSON to a storage bucket which is available to the\u00a0client. The aggregated results for days other than the current day are still needlessly recalculated - once midnight rolls around, the results are clearly not going to keep on\u00a0changing. Instead of having one JSON blob containing data for all of the last 30 days, I could have a blob for each day (or perhaps each week). This would reduce the amount of data extracted from the DataStore.
This would reduce costs and computation time."},{"title":"Edward Hopper\u2019s\u00a0Paintings","category":"snippet","url":"edward-hopper-s-paintings.html","date":"30 June 2021","tags":"art, painting, photography ","body":"archive"},{"title":"Some experiences can be taught, but some must be\u00a0lived","category":"snippet","url":"some-experience-can-be-taught-some-needs-to-be-lived.html","date":"29 June 2021","tags":"meta, advice ","body":"\u201cI have learnt that failure is my -"},{"title":"Georges St-Pierre Training\u00a0Meta","category":"snippet","url":"georges-st-pierre-training-meta.html","date":"29 June 2021","tags":"sport, meta, training ","body":"Contains too much conjecture at the start, but becomes more interesting. At high levels of competition, the difference between \u201cgood\u201d and \u201cgreat\u201d is partly determined by how much pain you are willing to\u00a0experience. There are great benefits from training in a different discipline: Anderson Silva is a ballet\u00a0dancer. Conor McGregor does Israel Adesanya Georges St-Pierre archive"},{"title":"Validating CloudFlare\u00a0analytics","category":"Technical/Web","url":"validating-cloudflare-analytics.html","date":"29 June 2021","body":"Table of Contents A mystery Possible Reasons Comparison CloudFlare Analytics CloudFlare Web Analytics My own analytics tool A mystery CloudFlare give me two different measures of how many people have visited my website, one using their Analytics product and the other from their new Web Analytics product. I get very different results for page views and number of visitors. I also get a third distinct set of results from my own analytics tool. Possible Reasons The reasons for this don\u2019t seem to be explained anywhere obvious, but I think it could be caused by ad blockers preventing the JavaScript used for the Web Analytics product from running. The normal (not Web) Analytics product might derive its results from server side events, which would catch everything including bots and RSS clients, and be unaffected by ad blockers.
Most of my visitors read the technical articles, and therefore the audience is probably very technical and likely to be using an ad blocker. If this is the case then one way to test my hypothesis would be to write some articles that appeal to a non-technical audience who are less likely to use an ad blocker. In this case I would expect the two analytics methods to agree more closely. Comparison I made some screenshots at 11pm on June 29\(^{th}\) and compared the results from CloudFlare Analytics, CloudFlare Web Analytics, and my own analytics tool. CloudFlare (normal) analytics say I\u2019ve had 234 unique visitors. But their Web Analytics tool says I\u2019ve had 11 visitors. My own tool reports 12 unique visitors. Why are these results so different? Maybe one measure might be including bots and another might only be trying to report real people using normal browsers, but the difference seems too high for that.1 I\u2019d also expect real usage to fall when it\u2019s night in the countries I get most traffic from, which I don\u2019t see. Perhaps the difference is caused by 95% of my readers using an ad blocker. My own analytics tool can\u2019t give results from a rolling 24 hour window; it only groups data by day. Therefore I recorded the values at 11pm, which should be close enough. My simple method of logging IP addresses when a page is loaded and counting the unique IP addresses each day says that I\u2019ve had 12 unique users. Much closer to the CloudFlare analytics beta result, but I wouldn\u2019t expect my bespoke tool to be blocked by an Ad Blocker. If it were as simple as concluding that my own results agree with the CloudFlare analytics beta then that might be enough. But they only agree on this particular metric. I\u2019ve logged 47 page views today using my own tool but the CloudFlare Analytics beta reports only 11 page views2. Please let me know on twitter if you have any ideas!
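The unique-visitor counting described above (distinct IP addresses per calendar day) amounts to something like this sketch; the record shape is an assumption, not the tool's actual schema:

```python
from collections import defaultdict
from datetime import datetime

def unique_visitors_per_day(log):
    """Count distinct IPs per calendar day.

    `log` is assumed to be a list of (iso_timestamp, ip) pairs.
    """
    ips_by_day = defaultdict(set)
    for ts, ip in log:
        day = datetime.fromisoformat(ts).date().isoformat()
        ips_by_day[day].add(ip)
    return {day: len(ips) for day, ips in ips_by_day.items()}
```

Grouping by calendar day rather than a rolling 24-hour window is exactly why the 11pm snapshot was needed for a fair comparison.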
CloudFlare Analytics CloudFlare Web Analytics My own analytics tool CloudFlare do tell me they\u2019ve blocked 199 attacks in the last month, but I don\u2019t think this explains the difference. \u21a9With an average load time of 90ms - pretty snappy if it can be believed. \u21a9"},{"title":"Proverbs\u00a016","category":"Non-technical/Journal","url":"proverbs-16.html","date":"25 June 2021","tags":"books, bible, wisdom ","body":"The first 9 verses seem loosely to be about plans and the motivation for various actions. The wisdom and purpose of God\u2019s actions are contrasted with the motivation of human plans. The first proverb is a bit of a riddle. \\(^{1}\\)The plans of the heart belong to man, but the answer of the tongue is from the Lord. I get the first part, but why would a spoken answer generally be from God? Is it that even if you are wise, and fear God, your plans are still your own but the way you speak about them is different because you fear God? \\(^{2}\\)All the ways of a man are pure in his own eyes, but the Lord weighs the spirit. Don\u2019t be surprised that foolish, stupid or evil people think that their actions and decisions are upstanding and good. Understand that God says the spirit (motivation or attitude) that produced the plan or the actions is what should be judged. I don\u2019t think that God would agree that \u201cthe end justifies the means\u201d. \\(^{3}\\)Commit your work to the Lord and your plans will be established. A good proverb for fridge magnets. \\(^{4}\\)The Lord has made everything for its purpose, even the wicked for the day of trouble. The mysteries around moral responsibility, free will and predestination are not unique to the New Testament. I can\u2019t think of anything useful to say in order to expand on this. I believe it\u2019s true, I don\u2019t really know what I would do differently now that I\u2019ve read this proverb.
I can trust God that he knows what he\u2019s doing and that he is good. I\u2019ll add it to my list of things I know about God without taking anything else off the list. \\(^{5}\\)Everyone who is arrogant1 in heart is an abomination to the Lord; be assured he will not go unpunished. God really doesn\u2019t like arrogance, which seems very similar to pride. This is troubling because society seems to have lost any appreciation for humility to the point where we no longer know how to talk about the virtues and vices of humility and pride. Pride is considered a virtue and is conflated with self-worth. We lack the nuance required to reliably discern wisdom from folly. \\(^{6}\\)By steadfast love and faithfulness iniquity is atoned for, and by the fear of the Lord one turns away from evil. It is refreshing and pleasant to read about steadfast love and faithfulness. I don\u2019t hear many people talking about these virtues - love has been reduced or redefined as something that can be found or changed quickly. Faithfulness isn\u2019t celebrated or spoken about very much. Maybe because it isn\u2019t as dramatic as betrayal. Faithfulness and steadfast love are predicated on humility, which is another concept society seems silent about. The second half of the proverb is also profound. It tells me how I can turn away from evil. I know that it is easier to say I will change my ways than it is to actually change, consistently. \u201cBeing good\u201d or \u201cturning away from evil\u201d is not nearly as easy or simple as a child thinks it is, and this proverb tells me how to do it. The last proverb of the previous chapter said that the fear of the Lord is instruction in wisdom, and that humility comes before honor. \\(^{8}\\)Better is a little with righteousness than great revenues with injustice. In case there was any doubt, here it is. Don\u2019t compromise yourself in order to make more money. \\(^{9}\\)The heart of a man plans his way, but the Lord establishes his steps.
So I can make my own plans, but if my plans are to be successful or substantial, I need the Lord to make my plans \u201cfirm\u201d or \u201cpermanent\u201d. \\(^{10}\\)An oracle is on the lips of a king; his mouth does not sin in judgement. I don\u2019t know what this means. An oracle is \u201ca priest acting as a medium through whom advice or prophecy was sought from the gods.\u201d Kings have certainly sinned when making judgements. \\(^{11}\\)A just balance and scales are the Lord\u2019s; all the weights in the bag are his work. God loves justice, and the instruments of justice are ultimately his, and are from him. \\(^{12}\\)It is an abomination to kings to do evil, for the throne is established by righteousness. The kings in verses 12 and 10 are not like kings I have heard about. If 12 is true then perhaps 10 can be true. At a minimum, 12 and 13 provide some standard by which to judge kings. \\(^{16}\\)How much better to get wisdom than gold! To get understanding is to be chosen rather than silver. One of the many bits of advice that could come from this would be to prioritise jobs with training and experience over jobs with higher immediate salaries. Especially in your twenties or when you are starting your career. You don\u2019t need lots of disposable income; you do need wisdom, training, experience, perspective, and time with wise or experienced people. \\(^{17}\\)The highway of the upright turns aside from evil; whoever guards his way preserves his life. Guarding your way - this is not a concept or expression I\u2019ve heard of before. I guess it could mean \u201cdefend the path your life could take\u201d, or \u201cthink about the consequences (second-order consequences)\u201d. \\(^{18}\\)Pride goes before destruction, and a haughty spirit before a fall. God really doesn\u2019t like pride, and also it leads to destruction. This is not a coincidence. Haughty means \u201carrogantly superior or disdainful\u201d, which is mostly a synonym for prideful.
\\(^{19}\\)It is better to be of a lowly spirit with the poor than to divide the spoil with the proud. Avoid prideful people, avoid evil people. Proverbs seems really clear that you are supposed to socialise and spend time with people you want to be like. A lowly spirit is \u201clow in status, or humble\u201d. Better to hang out with poor people and be humble than to be rich and associate with prideful people. \\(^{20}\\)Whoever gives thought to a matter will discover good, and blessed is he who trusts in the Lord. More encouragement to trust God, and to be considerate. 21 and 23 both say that it is wise to speak persuasively or encouragingly. If the words are well intended, then make them more effective by being persuasive. \\(^{21}\\)The wise of heart is called discerning, and sweetness of speech increases persuasiveness. \\(^{23}\\)The heart of the wise makes his speech judicious3 and adds persuasiveness to his lips. \\(^{24}\\)Gracious words are like a honeycomb, sweetness to the soul and health to the body. Our words affect our bodies, and our souls. Gracious words are really valuable. Gracious means to be kind, courteous, or patient. Let it go, be gentle. \\(^{25}\\)There is a way that seems right to a man, but its end is the way to death. Don\u2019t be so confident in your own wisdom and judgement. It\u2019s not just that your judgement is a bit less good than God\u2019s, this proverb says it\u2019s the total opposite. Expect to be doing things that look like the opposite of what someone might expect. Expect to do unintuitive things. \\(^{26}\\)A worker\u2019s appetite works for him; his mouth urges him on. True that. \\(^{32}\\)Whoever is slow to anger is better than the mighty, and he who rules his spirit than he who takes a city. Wow. This is high praise for self control. Arrogant: Having or revealing an exaggerated sense of one\u2019s own importance or abilities. \u21a9Established: Set up on a firm or permanent basis.
\u21a9Judicious: Having, showing, or done with good judgement or sense. \u21a9"},{"title":"I would like to take some time to explore what it means to be\u00a0alive","category":"snippet","url":"i-would-like-to-take-some-time-to-explore-what-it-means-to-be-alive.html","date":"24 June 2021","tags":"life ","body":"."},{"title":"Proverbs\u00a015","category":"Non-technical/Journal","url":"proverbs-15.html","date":"24 June 2021","tags":"books, bible, wisdom ","body":"\\(^{1}\\)A soft answer turns away wrath, but a harsh word stirs up anger. A soft answer can defuse a volatile situation, and speaking harshly leads to anger. \\(^{2}\\)The tongue of the wise commends knowledge, but the mouths of fools pour out folly. Knowledge is to be commended. If someone speaks a lot of folly, they are a fool. Folly is \u201cfoolishness, or a lack of good sense\u201d - it\u2019ll take discernment to judge a lack of good sense. Find someone who commends acquiring knowledge, and learning. \\(^{3}\\)The eyes of the Lord are in every place, keeping watch on the evil and the good. Don\u2019t mistake God\u2019s patience for indifference; know that your good works and faithfulness are seen. \\(^{4}\\)A gentle tongue is a tree of life, but perverseness in it breaks the spirit. Yet again, this book teaches that our words are powerful and have great consequences. A gentle tongue, a soft answer, guarded words preserve life, good fruit, fountain of life, now a tree of life. It is unequivocal, as is the inverse. \\(^{5}\\)A fool despises his father\u2019s instruction, but whoever heeds reproof is prudent. I just had a thought that people younger than 10 would have a difficult time heeding or despising, and are too young to be considered wise or foolish. They are children. When proverbs refers to parents and children, I reckon it\u2019s probably referring to either adult children or maybe teenagers. I have young kids so I\u2019m predisposed to see parenting advice in that context.
\\(^{7}\\)The lips of the wise spread knowledge, not so the hearts of fools. What you say is important. \\(^{8}\\)The sacrifice of the wicked is an abomination to the Lord, but the prayers of the upright are acceptable to him. Integrity matters, and it seems that God is more concerned with motivation than impact. Better to pray quietly and honestly than to hypocritically do good works in public. \\(^{11}\\)Sheol and Abaddon lie open before the Lord; how much more the hearts of the children of man! I am guessing that Sheol and Abaddon are some reference to a hellish place - if God can perceive what happens in a completely godless place that is far from him, it is going to be trivial to perceive the thoughts and motivations of your heart. He can read us like an open book. \\(^{12}\\)A scoffer does not like to be reproved; he will not go to the wise. Reproof is almost as strong a theme as words. Reproof is part of growing up and becoming wise. You must be reproved if you are to learn and become wise. If I do not reprove my children then I am negligent. If you want to avoid reproof or do not accept it then you are a fool. But remember, these are proverbs - they are generally true, most of the time. There will be situations where your parent gives bad advice and you\u2019d be wise to ignore it. But it\u2019s the exception, not the rule. If everyone else is the problem, then the problem is certainly you. \\(^{13}\\)A glad heart makes a cheerful face, but by sorrow the spirit is crushed. This proverb connects emotions, spirit, and the physical body. It says they are linked. If you are happy then you will look happy, and vice versa. If you carry around stress and tension then it\u2019s going to change how you look. Don\u2019t be sad indefinitely; it will crush you. Grieve, and grow, and move on. \\(^{14}\\)The heart of him who has understanding seeks knowledge, but the mouths of fools feed on folly.
This isn\u2019t talking about the consequences of your words, it says that if you have understanding then your heart will desire knowledge. If you feed on folly, which is like consuming things that are foolish, then you are a fool. \\(^{15}\\)All the days of the afflicted are evil, but the cheerful of heart has a continual feast. A pessimist and an optimist could experience the same event and become, respectively, more pessimistic and more optimistic. As much as you are able, choose to have a cheerful heart. If you are afflicted, then don\u2019t give up hope. Affliction doesn\u2019t mean you did something wrong or lack wisdom. \\(^{16}\\)Better is a little with the fear of the Lord, than great treasure and trouble with it. So the fear of the Lord leads to the avoidance of trouble, and it is better to live peaceably than deal with trouble. \\(^{17}\\)Better is a dinner of herbs where love is than a fattened ox and hatred with it. Better to eat only garnishes, or bait, with people who love you than fine dining with people who do not. \\(^{18}\\)A hot-tempered man stirs up strife, but he who is slow to anger quiets contention. Don\u2019t lose your temper. Blessed are the peace-makers. \\(^{22}\\)Without counsel plans fail, but with many advisers they succeed. You\u2019re not supposed to know everything and be completely independent. You are supposed to ask for help, weigh the advice, and deliberately look for wise people to be friends with. \\(^{23}\\)To make an apt1 answer is a joy to a man, and a word in season, how good it is! It is fun to say something apt? It is a blessing to receive a bit of well-timed advice. Proverbs talks a lot about words and mouths. For most of history, books have been rare and literacy was not widespread. Therefore most knowledge transfer would occur by speaking. Maybe it still does today because it feels like it requires less effort than reading. I think the same principles can be applied to reading and writing as for hearing and speaking.
\\(^{25}\\)The Lord tears down the house of the proud but maintains the widow\u2019s boundaries. The boundaries are the edges of the land owned by the widow. God really doesn\u2019t like proud people, and is soft-hearted towards the vulnerable. \\(^{27}\\)He who is greedy for unjust gain troubles his own household, but he who hates bribes will live. Don\u2019t be a fool, don\u2019t be bribed - directly or indirectly. Love life, look after your family, and hate bribes. Loving one thing means you hate other things. If you\u2019re building a business, design for Segregation of Duties. \\(^{28}\\)The heart of the righteous ponders how to answer, but the mouth of the wicked pours out evil things. It is good, and Godly, to think about your answer before you say it. \\(^{33}\\)The fear of the Lord is instruction in wisdom, and humility comes before honor. Ok great - I have a definition for what \u201cfear of the Lord\u201d means. (And it\u2019s a reliable definition too.) I still don\u2019t understand why this is what it means, but at least I know what it means. The why question is less foundational than the what. Learn humility, despise pride. Seek wisdom.
Apt: Appropriate or suitable in the circumstances. \u21a9"},{"title":"\u201cBelieve half of what you see and now\u2019t of what you\u00a0hear.\u201d","category":"snippet","url":"believe-half-.html","date":"24 June 2021","tags":"quote ","body":"source"},{"title":"Django for Startup\u00a0Founders","category":"snippet","url":"django-for-startup-founders.html","date":"23 June 2021","tags":"django, saas, startups, python ","body":"Better read this article archive"},{"title":"Building my own web\u00a0analytics","category":"Technical/Web","url":"building-my-own-site-analytics.html","date":"22 June 2021","tags":"cloud-functions, data ","body":"I\u2019ve built a simple client-side website analytics tool for this site, you can see it at /analytics. It has the following metrics: page views per day, unique IP addresses per day, and views per page per day. This article eventually made it to the front page of Hacker News, which resulted in a lot of extra traffic and an opportunity to see how the tool performed under a much heavier load. I wrote about the effects of this and subsequent design changes here. I compare the different results from CloudFlare Analytics, CloudFlare Web Analytics and my own tool in this follow-up article. Motivation Google Analytics Google Analytics felt like overkill. It has so many data-points that the useful metrics are obscured. I also like this site to load quickly and GA makes it slower. CloudFlare Analytics I\u2019ve also tried CloudFlare Analytics. It\u2019s a lot simpler than GA and better suits my use case, but I don\u2019t think it\u2019s accurate. Design Considerations The analytics should be easy to access and easy to understand. I know from my work visualizing data and building dashboards that the metrics presented will alter the user\u2019s perception of the underlying reality.
The way that someone thinks about their impact on a business, the value they\u2019ve produced, or the dynamics of the underlying system (a product\u2019s quality, site performance, growth, etc.) is influenced by the design decisions I make, such as which metrics are available, how easy they are to access, or which metrics are above the fold. If I present a particular metric as if it\u2019s important, it will be difficult for someone who uses the dashboard to resist this implied message. They\u2019ll eventually consider the metric as a Key Indicator of some kind. For these reasons I wanted to see only the most important metrics about my website, and I wanted to see them in a simple way without distraction. The only metrics I\u2019m interested in are: how many people are reading my site, what are they reading, and how much are they reading. I\u2019d like to be able to infer whether I have a few people who read a lot, or a lot of people who read a little. (Or, as is the case, a few people who read a little.) Method Motivation The main reason for making my own analytics tool is because it\u2019s a fun challenge with an obvious and useful result. Building it required connecting a few technologies - serverless computing (Cloud Functions on GCP), NoSQL databases (DataStore), JavaScript, and HTTP headers. Assumptions I\u2019m assuming that unique IP addresses are a good enough proxy for unique readers, even though I\u2019m not considering crawlers, bots, or RSS subscribers. Technique The analytics \u201cengine\u201d works by consuming a request that is sent by the client each time a page is loaded. The request is parsed by a Cloud Function on GCP which extracts the page URL and the IP address. This is then recorded in a DataStore database along with the current date and time.
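A minimal, self-contained sketch of that pipeline (the function names and record fields are my own assumptions, not the actual deployed code; the real logging function would write to DataStore via the google-cloud-datastore client, which is replaced here with plain dicts so the parsing and counting logic stands alone):

```python
from collections import defaultdict
from datetime import datetime, timezone

def page_view_record(headers, url):
    """Build the record stored for each page load (hypothetical sketch).

    Behind Google's HTTP front end the original caller's IP arrives in the
    X-Forwarded-For header; the first comma-separated entry is the client.
    In the deployed Cloud Function this dict would be put into DataStore.
    """
    forwarded = headers.get("X-Forwarded-For", "")
    ip = forwarded.split(",")[0].strip() or "unknown"
    return {"url": url, "ip": ip, "ts": datetime.now(timezone.utc).isoformat()}

def unique_visitors_per_day(records):
    """Group stored records by date and count distinct IP addresses."""
    ips_by_day = defaultdict(set)
    for record in records:
        day = record["ts"][:10]  # ISO 8601 timestamps slice cleanly to YYYY-MM-DD
        ips_by_day[day].add(record["ip"])
    return {day: len(ips) for day, ips in sorted(ips_by_day.items())}
```

The reporting side is then just `unique_visitors_per_day` run over the fetched records and returned as a JSON payload.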
Viewing the analytics is as simple (and as complicated) as making a request to the database, parsing the data and visualizing it conveniently. For example, group the data by days and count the distinct IP addresses to figure out how many people are visiting each day. This is achieved by making a request to another Cloud Function that returns a response with a JSON payload. It\u2019s not a perfect solution; there are edge cases I\u2019m not considering. I expect it to be mostly right and good enough for my purposes. It didn\u2019t take much effort and it was a fun mini project. The hardest part was figuring out chart.js; the slowest part was iterating on the Cloud Functions. Mocking Cloud Functions I haven\u2019t figured out how to easily test cloud functions locally - it would require setting up a NoSQL database and mocking Flask requests and responses. Instead of doing that, I watched Peaky Blinders for a couple of minutes whilst each new version of the Cloud Function was deploying. Improvements Eventually I\u2019ll want to group the metrics by week or month, I expect. It\u2019ll be a good way of learning and playing with cloud technologies and JavaScript. Unless someone decides to spam the site, I expect the costs to be less than \u20ac1/month. This site is hosted using CloudFlare so I suppose I could set up some page rules to prevent malicious traffic3. Tasks for later Make loading faster - latency is caused by the Cloud Function initialising. Short of paying actual money for always-on resources I can\u2019t see a way to reduce this. However it\u2019s only an issue if you are the first person to view the page in the last ~10 minutes - this blog post explains why. Add loading spinners - I used the same snippets as in my Machine Vision demo. Group data by weeks or months as well as days. Identify bots and search engines - the analytics requires JavaScript to be running so I think some types of non-human activity are already filtered. How can I do this?
Aggregate the data (once per day) in a Cloud Function instead of repeatedly in the browser. Understand why the DataStore API is called multiple times for a single fetch. Questions I\u2019d be interested to know if there is a way to track RSS subscribers. I know that the usual method is to inspect server logs, but this site is hosted on GitHub pages so I don\u2019t think this is possible. To what extent does requiring JavaScript in order to log a page view filter out bots and crawlers? I\u2019ve used the chart.js library because it\u2019s reasonably fast and lightweight. My preferred library would be Plotly if it could be responsive and fast even if there are >10 charts to render. Has plotly.js improved recently to the point where it wouldn\u2019t cause a browser to lag if multiple plots are being rendered? Finally, it occurs to me that I could make an analytics widget for my desktop using \u00dcbersicht. It could show page views for the current day perhaps. I\u2019ve made a couple of widgets before [1, 2] which were written in CoffeeScript, but the newer widgets are written in React, so I guess this is an opportunity to learn4. Writing the \u201cTime Since\u201d (my daughter\u2019s birth) and \u201cTime Until\u201d (my next accounting exam5) widgets was my first ever taste of CSS, HTML and JavaScript. The first ever article on this blog was about the \u201cTime Since\u201d widget. CoffeeScript and \u00dcbersicht were just about simple enough for me to learn by trial and error, copying someone else\u2019s code and changing it bit by bit until I had what I wanted. Site Analytics In Google Analytics it can be fun clicking around on all the things and seeing lots of options, but it\u2019s not really useful once the novelty has worn off. \u21a9I think this might be quite wrong, but I don\u2019t know why. \u21a9The page is now rate limited to 5 requests per minute per IP address. \u21a9Done!
My desktop now looks like this: \u21a9I failed the exam because I\u2019d been working on Ry\u2019s Git Tutorial instead. \u21a9"},{"title":"Alfie\u00a0Solomons","category":"snippet","url":"alfie-solomons.html","date":"22 June 2021","tags":"movie, youtube, peaky-blinders, humanity ","body":"Alfie Solomons - Where the light comes in. Scenes showing the character of Alfie Solomons from the movie"},{"title":"Bifurcation\u00a0Theory","category":"snippet","url":"bifurcation-theory.html","date":"17 June 2021","tags":"math, chaos ","body":"Rabbits, fluid convection, the Mandelbrot set, and lots of other things too. Also known as The video shows how the Feigenbaum Constant is defined. It\u2019s a fundamental constant I hadn\u2019t come across before -\u00a04.6692.. archive"},{"title":"Apple\u2019s iCloud+ \u201cVPN\u201d","category":"snippet","url":"apple-s-icloud-vpn-.html","date":"16 June 2021","tags":"apple, onion, vpn, icloud ","body":"article \u201cAn Apple onion router. The routing uses two hops; Apple provides the first, and independent third parties (not yet specified) provide the\u00a0second.\u201d \u201cIn one move, Apple has taken onion routing from a specialized tool for hackers to something that will be in daily\u00a0use.\u201d"},{"title":"Rich","category":"snippet","url":"rich.html","date":"16 June 2021","tags":"python, console, shell ","body":"python -m rich for a\u00a0demo Cool Python module to handle terminal output with debugging and logging features. It can even record stack trace errors to\u00a0HTML. repo demo\u00a0video"},{"title":"Practice","category":"snippet","url":"practice.html","date":"16 June 2021","tags":"proverb ","body":"An amateur practices until they can play it correctly; a professional practices until they can\u2019t play it wrong."},{"title":"Vim\u00a0Sneak","category":"snippet","url":"vim-sneak.html","date":"15 June 2021","tags":"vim, plugin ","body":"Invoked with s followed by 2\u00a0chars.
S, F, f, T, and t are enabled to work across\u00a0lines. Jump with ; or , to go to the next/previous match. 5sxy searches for the next instance of xy within 5\u00a0lines. 3dzqt deletes up to the third instance of qt. repo"},{"title":"Design\u00a0Patterns","category":"Technical/Engineering","url":"design-patterns.html","date":"15 June 2021","tags":"abstractions, meta, software-engineering ","body":"Design patterns are generalized abstractions that solve common problems and help engineers create complex code reliably and quickly. I first heard about design patterns from Aaron Maxwell in his Powerful Python newsletter and made some notes. Then the YouTube algorithm put the following video on my front page, and down the rabbit hole I\u00a0went. Take a look at the Borg pattern. These are some Java examples: SourceMaking, fair repo (and my fork). Some pdf resources, including the GoF\u00a0book: all the GoF\u00a0patterns, the GoF\u00a0catalog, and the Wikipedia page about the GoF\u00a0book."},{"title":"Proverbs\u00a014","category":"Non-technical/Journal","url":"proverbs-14.html","date":"14 June 2021","tags":"books, bible, wisdom ","body":"\\(^{1}\\)The wisest of women builds her house, but folly with her own hands tears it down. This chapter begins with a proverb about women! And not by juxtaposing men either. It\u2019s important that I remember that wisdom - the desirable quality that so much of proverbs is about - is personified as a woman, like in chapter 9:1: Wisdom has built her house; she has hewn1 her seven pillars. \\(^{2}\\)Whoever walks in uprightness fears the Lord, but he who is devious in his ways despises him. Fear is contrasted with despising, so I don\u2019t think this is supposed to be fear of violence or victimisation, but more like a feeling of reverence and profound respect. The Lord loves me, and in the previous chapter it says that parents discipline their children. \\(^{4}\\)Where there are no oxen, the manger is clean, but abundant crops come by the strength of the ox.
This is really interesting; the point feels modern. The rural imagery is unusual (to me) but otherwise it feels like something you might find on Instagram. What\u2019s the lesson? Progress is messy, or doing work creates waste; don\u2019t fret about the cleanliness of your manger if you want to have a productive farm. \\(^{5}\\)A faithful witness does not lie, but a false witness breathes out lies. It\u2019s really important - don\u2019t lie. This is a strong recurring message. \\(^{6}\\)A scoffer seeks wisdom in vain, but knowledge is easy for a man of understanding. What does he understand? That scoffing is harmful to the scoffer, and what else? \\(^{7}\\)Leave the presence of a fool, for there you do not meet words of knowledge. It would be good to have a clearer understanding of the difference between understanding and knowledge. This proverb is very similar to modern sayings like \u201cYou become like the 5 people you spend most time with\u201d. What are the marks of a fool? How do you know if you\u2019re hanging out with a foolish person? They talk a lot, without thinking about their words. They are quick to become angry or emotional. They are not diligent, or consistently hard working. They don\u2019t plan ahead (filling their barns in summer). \\(^{8}\\)The wisdom of the prudent is to discern his way, but the folly of fools is deceiving. It\u2019s wise, and prudent, to discern a course of action, or series of events and decisions. Discern means \u201crecognise or find out\u201d; I tend to use it to mean \u201clook closely for additional clues about what might happen\u201d. \u201cPrudent\u201d means to \u201cact or demonstrate care and thought for the future\u201d. \\(^{10}\\)The heart knows its own bitterness, and no stranger shares its joy. This, I think, makes total sense to an old person and seems mysterious to young people. When I first read it I thought it said something like \u201cno-one shares its joy\u201d, but it doesn\u2019t.
Only a \u201cstranger\u201d. Family and close friends can share joy, but they still won\u2019t be able to know your bitterness. I\u2019d recommend not holding onto bitterness and finding out how it\u2019s possible to forgive because of the work that Jesus completed. \\(^{11}\\)The house of the wicked will be destroyed, but the tent of the upright will flourish. Again, don\u2019t be wicked, it will go badly for you. If you are upright and living in a humble, fragile, vulnerable tent you can still flourish. \\(^{12}\\)There is a way that seems right to a man, but its end is the way to death. This book doesn\u2019t pull any punches. It\u2019s an alarming assertion that someone can think they are doing things \u201cright\u201d but are in fact heading towards death. Don\u2019t rely on your own understanding; figure out what God thinks about a thing. \\(^{13}\\)Even in laughter the heart may ache, and the end of joy may be grief. At the moment I don\u2019t have much experience of grief, but I inevitably will. I know there is sorrow so profound that it becomes physical as well as emotional, and regret that changes what it means to be alive. I guess one of the things this proverb reveals is that it\u2019s ok to laugh whilst experiencing heartache, and it\u2019s natural to feel happy and sad (laughing and heart-aching) at the same time. It\u2019s part of being alive emotionally, and shouldn\u2019t be considered weird or broken. 14 and 15 reiterate themes I\u2019ve noted previously. \\(^{16}\\)One who is wise is cautious and turns away from evil, but a fool is reckless and careless. It is wise to not do something, even if you could. You don\u2019t have to do/see/visit all the things, even if you could. It\u2019s ok to risk erring on the side of caution, because if you are not cautious, you are foolish. I guess that if you are discerning then you can perceive more clearly, and what was originally partially known and therefore risky becomes less risky because there are fewer unknowns.
Instead of \u201cmaybe it\u2019s wrong\u201d it can become \u201cit\u2019s almost definitely wrong, or right\u201d. \\(^{17}\\)A man of quick temper acts foolishly, and a man of evil devices is hated. Don\u2019t lose your temper quickly. Losing it slowly is often more difficult. \\(^{18}\\)The simple inherit folly, but the prudent are crowned with knowledge. You\u2019re supposed to aspire to prudence (and more generally, wisdom). You can\u2019t really opt out of this, because if you\u2019re not wise then you\u2019re a fool. If you\u2019re not prudent, you are (too) simple. And bad things happen to fools. \\(^{19}\\)The evil bow down before the good, the wicked at the gates of the righteous. It\u2019s surprising to read such bold and simple confidence that justice would prevail. No ifs or maybes or conditions, just a simple resolution at the end. \\(^{20}\\)The poor is disliked even by his neighbour, but the rich has many friends. Timeless pragmatism. This proverb is an observation, not a commendation. And look at the next proverb. \\(^{21}\\)Whoever despises his neighbour is a sinner, but blessed is he who is generous to the poor. Be generous, not just theoretically or in your thoughts, but also in your actions and your money (see 23). Don\u2019t despise people. Who is your neighbour? \\(^{22}\\)Those who devise good meet steadfast love and faithfulness. Steadfast love and faithfulness. \\(^{23}\\)In all toil there is profit, but mere talk leads only to poverty. It\u2019s not enough to only talk about loving your neighbour. And don\u2019t be convinced that any hard work you do is totally wasted, apparently it is not. \\(^{24}\\)The crown of the wise is their wealth, but the folly of fools brings folly. Really? Does wealth here mean something more or different to financial wealth? (I think I asked this in a previous proverb.) Not all wise people are rich, this is certain. And many biblical heroes were financially impoverished. Is it just a wider definition of wealth? \\(^{25}\\)A truthful witness saves lives. Literally.
Though probably without knowing exactly which lives, and at what moment. Investors refer to this as second order consequences. \\(^{26}\\)In the fear of the Lord one has strong confidence, and his children will have refuge. The best that I can understand this is to read it the same as if it said "In the Lord one has strong confidence". What does this proverb mean that my shortened, simplified version doesn't? \\(^{27}\\)The fear of the Lord is a fountain of life, that one may turn away from the snares of death. Turning away is like physically repenting. A fountain of life is a source of life, and probably health and healing. It is contrasted with death traps. It would be interesting to consider the relative importance of feelings and actions in this book. Is it that feelings lead to actions, but actions are what make the difference - the judgement between wisdom and foolishness, love and hatred, depends (only) on your actions? \\(^{28}\\)In a multitude of people is the glory of a king, but without people a prince is ruined. If a king doesn't have any subjects, what is he king of? If leaders are not followed by any people, they cannot claim to be leaders. Whoever is slow to anger has great understanding, but he who has a hasty temper exalts folly. If you have a quick temper then you are responsible for it, and you are endorsing foolishness. Being slow to anger is a sign of wisdom. \\(^{30}\\)A tranquil heart gives life to the flesh, but envy makes the bones rot. Envy is apparently really bad. Tranquility is freedom from disturbance, or being calm. It's good for your health. Whoever oppresses a poor man insults his maker, but he who is generous to the needy honors him. The poor matter, and have dignity, and are worthy of respect, because we are all made.
Hewn: chop or cut with an axe \u21a9"},{"title":"Coding exercise for a technical\u00a0interview","category":"Technical/Data","url":"coding-exercise.html","date":"14 June 2021","tags":"trading, finance ","body":"This notebook is a technical exercise I worked on as part of an interview for a crypto trading firm. The exercises involve building simplified interfaces to parse order book data and calculate various quantities. I find technical interviews that involve live coding exercises to be really useful and really stressful. Live coding definitely triggers "performance anxiety" for me (when I was a kid I really hated presenting or playing an instrument in front of anyone, even teachers) - I felt self-conscious, and in this case I needed to use VSCode instead of my familiar Vim+Tmux setup. Consequently, progress was really slow and bumpy. Tragically, I got really muddled trying to parse some JSON after the initial API request. Whilst thinking about the problem and what code to write, I also needed to think about the following: How to communicate my thoughts. How to optimize my output for an interview context (it's not a real problem involving tests, edge cases, scalability, etc.). How to use VSCode (and a mouse or trackpad). These additional considerations resulted in me writing code that got the job done.. but slowly. The code wasn't great. I was curious how much faster I'd be (and how much easier the exercise would seem) if I treated the exercise as a "take home" exercise instead of "live coding". Here are the results - it took a couple of hours, the ideas flowed more easily, and I could remember methods more accurately. I wish I didn't (still) get so nervous. Despite this, I think live coding interviews are a great way of assessing me - they get under my skin and show my worst sides as well as my best.
If I'm going to work in a high paced and demanding role then the interview should also have that intensity. Part 1: The Interface\u00b6 Build an abstraction that, given a pair as an argument (ethusd in our case), fetches the latest orderbook and prints it.\u00b6 In [1]: import requests import operator import pprint import pandas as pd import time pp = pprint.PrettyPrinter() In [2]: def get_data(url): r = requests.get(url) data = r.json() return data In [3]: 2. Add some functionality that takes a side (bid or ask) and a price p as arguments, and returns the total volume available in the order book at p.\u00b6 In [4]: def calc_volume(side, price): data = get_data(url) # side should be either "asks" or "bids" clean_data = [] for i, j, k in data[side]: price_data = float(i) volume = float(j) clean_data.append((price_data, volume)) # find all the items where the first item is less than or equal to price index = 0 volume = 0 operate = { 'bids': operator.ge, 'asks': operator.le } op = operate[side] for i, j in clean_data: if op(i, price): index += 1 volume += j else: break return volume In [5]: calc_volume("bids", 2470) Out[5]: In [6]: calc_volume("asks", 2480) Out[6]: 0 3. Now add some functionality (or modify existing functionality) that takes a percentage and a side as arguments, and returns the volume available between the best price for that side, and that price +/- the percentage.\u00b6 In [7]: def calc_volume_percent(side, percent): data = get_data(url) # side should be either "asks" or "bids" clean_data = [] for i, j, k in data[side]: clean_data.append((float(i), float(j))) index = 0 volume = 0 starting_price = clean_data[0][0] # if side == 'asks': percent *= -1 op = operator.add if side == "asks" else operator.sub limit_price = op(starting_price, (starting_price * percent)) operate = { 'bids': operator.ge, 'asks': operator.le } op = operate[side] for i, j in clean_data: if op(i, limit_price): index += 1 volume += j else: break return volume In [8]: calc_volume_percent("bids", 0.01) Out[8]: 4.
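A minimal, self-contained sketch of the volume-at-price idea, with a made-up order book standing in for the real API response (the name `volume_at_price` and the `[price, volume]` row shape are illustrative assumptions, not the exercise's exact code):

```python
import operator

def volume_at_price(book, side, price):
    """Total volume available at `price` or better, for one side of a book.

    `book` maps "bids"/"asks" to [price, volume] rows, best price first.
    Bids count while their price is >= the target; asks while <= it.
    """
    better_than = {"bids": operator.ge, "asks": operator.le}[side]
    volume = 0.0
    for row_price, row_volume in book[side]:
        if better_than(float(row_price), price):
            volume += float(row_volume)
        else:
            break  # rows are sorted best-first, so we can stop early
    return volume

# hypothetical order book snapshot
book = {
    "bids": [["2480.35", "21.9"], ["2480.34", "7.9"], ["2470.00", "3.0"]],
    "asks": [["2480.36", "0.5"], ["2480.48", "0.1"]],
}
print(round(volume_at_price(book, "bids", 2470), 6))  # 32.8 (all three bid levels)
print(volume_at_price(book, "asks", 2480))            # 0.0 (best ask is above 2480)
```

The early `break` relies on the book being sorted best-price-first, which is how both sides are iterated in the notebook above.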
Can you visualize the order book?\u00b6 In [9]: asks = bids = In [10]: asks = pd.DataFrame(asks) asks.columns = ['price', 'volume', 'timestamp'] asks['volume'] = asks['volume'].astype(float) asks['cumulative'] = asks['volume'].cumsum() asks.head() Out[10]: price volume timestamp cumulative 0 2480.36000 0.591 1623670712 0.591 1 2480.48000 0.006 1623670590 0.597 2 2480.93000 0.007 1623670457 0.604 3 2480.96000 0.005 1623670684 0.609 4 2480.99000 0.172 1623670589 0.781 In [11]: bids = pd.DataFrame(bids) bids.columns = ['price', 'volume', 'timestamp'] bids['volume'] = bids['volume'].astype(float) bids['cumulative'] = bids['volume'].cumsum() bids.head() Out[11]: price volume timestamp cumulative 0 2480.35000 21.903 1623670721 21.903 1 2480.34000 7.930 1623670720 29.833 2 2480.25000 0.250 1623670709 30.083 3 2480.22000 8.061 1623670715 38.144 4 2480.21000 16.113 1623670720 54.257 In [12]: ob = In [13]: inplace=True inplace=True In [14]: ax = xticks = xticklabels = [l.get_text() for l in Part 2: Time-series\u00b6 Build an abstraction that takes the following: A pair. A unit of time (t) that can be either "seconds", "minutes" or "hours". A number of time units (n). Given these arguments, it should fetch and store the order book for the pair every n t. For example: if t = "second" and n = 3, it should fetch the order book every second for a total of 3 times. In [15]: def get_orderbooks(pair, tunit, n): base_url = pair_mod = results = {} timeunits = { "s": 1, "m": 60*1, "ms": 0.001 } realtime = timeunits[tunit] for i in range(n): print(f"getting snapshot {i+1} of {n}") r = requests.get(base_url) data = r.json() results[i] = data time.sleep(realtime * n) # had to modify by multiplying by n so that the api is more likely to return a different data set print("done!") return results In [16]: results = get_orderbooks("ethusd", "s", 3) getting snapshot 1 of 3 getting snapshot 2 of 3 getting snapshot 3 of 3 done!
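The fetch-every-n-units loop can be sketched without a live API by injecting the fetch function; `collect_snapshots` and `fetch_snapshot` are hypothetical names for this sketch, not the notebook's own:

```python
import time

def collect_snapshots(fetch_snapshot, tunit, n):
    """Call `fetch_snapshot()` once per time unit ("s", "m" or "ms"), n times.

    Returns {index: snapshot}. The fetch function is injected so the loop
    can be tested offline; in the notebook it would wrap requests.get().
    """
    seconds_per_unit = {"s": 1, "m": 60, "ms": 0.001}
    interval = seconds_per_unit[tunit]
    results = {}
    for i in range(n):
        results[i] = fetch_snapshot()
        if i < n - 1:
            time.sleep(interval)  # wait before taking the next snapshot
    return results

# stub fetcher standing in for the real order book endpoint
snaps = collect_snapshots(lambda: {"bids": [], "asks": []}, "ms", 3)
print(len(snaps))  # 3
```

Injecting the fetcher also makes it easy to replay recorded snapshots through the frame-comparison code later in the notebook.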
We see that the hash of the api response is sometimes the same, so we would expect there to be no volume change between these snapshots. In [17]: #results Now extend your code to compare the order books you've fetched to each other. Given the name of a pair, n, t and a price p, determine how much the total volume available at p has changed between each "frame", and between the first and last "frame".\u00b6 In [18]: # redefine "calc_volume" function from above to make data an input param def calc_volume(data, ob_side, price): # side should be either "asks" or "bids" side = data[ob_side] clean_data = [] for i, j, k in side: clean_data.append((float(i), float(j))) # find all the items where the first item is less than or equal to price index = 0 volume = 0 operate = { 'bids': operator.ge, 'asks': operator.le } op = operate[ob_side] for i, j in clean_data: if op(i, price): index += 1 volume += j else: break return volume In [19]: def compare_volumes(results, ob_side, price): snap_vol = [] for key in results: result = results[key] snap_vol.append(calc_volume(result, ob_side, price)) total_change = snap_vol[-1] - snap_vol[0] incremental = [snap_vol[i+1] - snap_vol[i] for i, j in enumerate(snap_vol) if i < len(snap_vol) - 1] print(f"vol in each snapshot: {snap_vol}") print(f"incremental: {incremental}") print(f"total (diff between last and first): {total_change}") In [20]: results = get_orderbooks("ethusd", "s", 3) getting snapshot 1 of 3 getting snapshot 2 of 3 getting snapshot 3 of 3 done! In [21]: #results In [23]: compare_volumes(results, "bids", 2470) vol in each snapshot: incremental: [-6, 0] total (diff between last and first): -6 Finally, can you think of an efficient way to get a "diff" of two frames? For example, can you think of a way to determine at which price the volume changed the most?\u00b6 Quick idea: for each side of each order book, group the orders into buckets with a certain width, e.g. if the lowest ask is 2450, group all the asks from 2450 to 2451, then from 2451 to 2452, etc.
Do this for both sides of each order book you collect. You then have multiple aggregated order books with the same index. Compare the volumes at the same index across each order book. Need to force the lowest ask or max bid to be the same in each order book snapshot. Need to deal with large spreads. A comparison where one side has 0 volume should be marked as suspicious and investigated further. Also need to present negative volumes."},{"title":"Axiom","category":"snippet","url":"axiom.html","date":"12 June 2021","tags":"math ","body":"A basic statement that is assumed to be true. E.g: "A straight line can be drawn between any two points.""},{"title":"Foxes and\u00a0Hedgehogs","category":"snippet","url":"foxes-and-hedgehogs.html","date":"12 June 2021","tags":"meta, classification, thinking ","body":"wikipedia Hedgehogs know one big thing, Foxes know many things. A classification system or mental model for writers and thinkers."},{"title":"Pair programming using\u00a0Vim","category":"snippet","url":"pair-programming-using-vim-tmux-ssh.html","date":"12 June 2021","tags":"pair-programming, vim, tmux, ssh ","body":"blog post"},{"title":"Man swallowed by\u00a0whale","category":"snippet","url":"man-swallowed-by-whale.html","date":"12 June 2021","tags":"whale ","body":"A lobster diver was swallowed by a humpback whale. Then it spit him out.
Humpback whales don't have teeth, and have reduced forward vision when they open their mouths to feed. article"},{"title":"Git LFS","category":"snippet","url":"git-lfs-2.html","date":"11 June 2021","tags":"git, lfs, github-pages ","body":"Key commands: git lfs install git lfs track "**/*.mp4" git lfs ls-files git lfs status track just updates the .gitattributes file. Commit the .gitattributes file with the tracking configuration before committing the large files. status or ls-files should show the large files in question before you push the commit that starts tracking the large files."},{"title":"Your .bashrc doesn\u2019t have to be a\u00a0mess","category":"snippet","url":"your-bashrc-doesn-t-have-to-be-a-mess.html","date":"11 June 2021","tags":"bash, shell, zsh ","body":"Blog post demonstrating how to split a .bashrc file into "submodules" and keep it tidy: for file in <dir>/*; do [[ -r "$file" ]] && . "$file"; done; unset file"},{"title":"John Kelly finishing the 2017 Barkley\u00a0Marathons.","category":"snippet","url":"barkley-finisher-15-john-kelly.html","date":"10 June 2021","tags":"movie, running, barkley, youtube ","body":"The exhausted moments after running for almost 60 hours through the mountains. movie"},{"title":"A Project of One\u2019s\u00a0Own","category":"snippet","url":"paul-graham-a-project-of-ones-own.html","date":"8 June 2021","tags":"paul-graham, meta, learning, school ","body":"An essay called A Project of One's Own by Paul Graham. Being pushed into a task vs being pulled. Skating vs walking."},{"title":"Proverbs\u00a013","category":"Non-technical/Journal","url":"proverbs-13.html","date":"8 June 2021","tags":"books, bible, proverbs, wisdom ","body":"\\(^{1}\\)A wise son hears his father's instruction, but a scoffer does not listen to rebuke.
The chapter opens the same way as the previous chapter - with a proverb about the importance of wisdom, knowledge or instruction. As always, the proverb starts with the good example and then contrasts against it. \\(^{2}\\)From the fruit of his mouth a man eats what is good, but the desire of the treacherous is violence. Slightly weird imagery I think; my mouth produces fruit that I can then eat.. Weird. But I can see the principle - the fruit of my mouth is the words I speak, and words are powerful for either good or evil. Speaking well will lead to good things that do what good food does - nourish, strengthen, sustain. Treacherous people desire violence. So are people who want to do violent things likely to betray? Maybe. \\(^{3}\\)He who guards his mouth preserves his life; he who opens wide his lips comes to ruin. Choose your words carefully. \\(^{5}\\)The righteous hates falsehood, but the wicked brings shame and disgrace. It's OK to be strongly opposed to falsehoods - lies, manipulation, duplicity. Righteousness guards him whose way is blameless, but sin overthrows the wicked. I guess there are (at least) 2 ways of looking at this - the only way I can be blameless is by Jesus' imputed righteousness, which also saves me literally from death. The alternative is that I am not righteous and am overcome by sin. Alternatively, and more prosaically: try to be blameless instead of wicked, and instead of creating trouble for yourself you'll find people bear with you for longer and are nicer to you. \\(^{7}\\) One pretends to be rich, yet has nothing. Another pretends to be poor, yet has great wealth. (I prefer not to use semicolons.) I really like this proverb! The last 20 years have seen several degrees of financial wealth or lack, and the reality has often been very different to what I'd expected. I often intuit that things must get easier, or nicer, or more fulfilling if only a certain problem was removed, or a certain something was bought, or recognised or achieved.
The reality is a lot more complicated than that. Wealth isn't what I thought it was; I thought it was mostly financial. Now, I think it's freedom. Freedom to be at peace, to be unburdened of my past and present, freedom to still have hope for the future, freedom to have some spare time, freedom to not be crushed by life and to be a good dad and husband and friend. \\(^{8}\\)The ransom of a man's life is his wealth, but a poor man hears no threat. Oh! Incisive! This reminds me of "no one ever really owns a fortune, it always seems to own them". There are various troubles and burdens you avoid by being poor, like choosing life instead of material excess. Lose your wealth and choose a rich life. \\(^{10}\\)By insolence1 comes nothing but strife, but with those who take advice is wisdom. Disrespect and rudeness never work; wisdom is acquired when you make a habit of asking for advice. Wealth gained hastily will dwindle, but whoever gathers little by little will increase it. I'm surprised to read this - I didn't realise that the rate of change of one's wealth would likely affect how it can be sustained, and I wouldn't have guessed that this is some general principle that exists across millennia, cultures and geographies. \\(^{12}\\)Hope deferred makes the heart sick, but a desire fulfilled is a tree of life. This one I've heard before. I don't really understand it, though. I'm not sure what a sick heart definitely is. It seems relevant because we are told to keep on looking forward to and hoping for Jesus to complete his work, saving us and bringing history to completion. There is a lot of "now and not yet" tension in between the time of Jesus' resurrection and second coming (I know so little about it that I don't particularly want to mention it). So maybe a Christian's heart would be expected to be a little sick?
Also, if a desire fulfilled is a tree of life then I guess I'm desiring the wrong kind of things - most of my desires are chasing after the wind. I desire to eat and then I get hungry again, desire to graduate and then realise the work hasn't even begun, etc. The image of a tree of life hasn't been used before, I don't think; maybe it means something specific which I'm unaware of. There was a tree of life in the garden of Eden… 13 - 16: Don't ignore wisdom, don't be an idiot, be prudent, have good sense. Strong recurring themes. \\(^{16}\\)Every prudent man acts with knowledge, but a fool flaunts his folly is interesting though - I think prudence is being used in a slightly different sense to what I'm used to. This proverb is saying that one should be informed before acting. And because a fool rushes into action, it will be clear that they don't know what they are doing or talking about. \\(^{22}\\) A good man lays up an inheritance for his children's children, but the sinner's wealth is laid up for the righteous. I remember reading this when I was about 18 and thinking this was quite a burden - to leave an inheritance for grandchildren as well as children. Seventeen years later it makes a lot of sense - the inheritance isn't primarily money, but wisdom, peace, security. The way that I parent my children will directly affect how they parent their own children, and in this way I will either bless or curse my grandchildren. If I damage my children, they will suffer and be less able to provide for their own children (emotionally, physically, spiritually). Our own childhoods have a large influence on our adulthoods and our ability to parent, so I should make sure my children have good childhoods, which is not a trivial endeavour. 23 and 24 break the pattern of "good example, bad example." \\(^{23}\\) The fallow2 ground of the poor would yield much food, but it is swept away through injustice.
The missed opportunities are real. Poor people are not incapable of productive and fruitful work. It is injustice that prevents fruitfulness. I find this proverb subtly provocative and insightful. \\(^{24}\\) Whoever spares the rod hates his son, but he who loves him is diligent to discipline him. Parents who love their children do not enjoy causing them stress or discomfort, but if they love them and are wise then they discipline diligently, or consistently. Apparently physical punishment is timeless? \\(^{25}\\) The righteous has enough to satisfy his appetite, but the belly of the wicked suffers want. The last proverb in the chapter finishes with encouragement that it is better to be righteous than wicked, because it leads to satisfaction rather than want. The previous chapter also finished with encouragement, specifically about righteousness. Even though the book doesn't have headings it seems like it does have structure, and the author requires you to actually read the text closely in order to pull out meaning. Shocking. Clearly not optimised for engagement, SEO or social media. Insolence: rude and disrespectful behaviour \u21a9Fallow: ploughed and harrowed but left for a period without being sown in order to restore its fertility or to avoid surplus production \u21a9"},{"title":"Choose Boring\u00a0Technology","category":"Technical/Engineering","url":"choose-boring-technology.html","date":"7 June 2021","tags":"advice, management ","body":"I'm surprised I haven't posted this before because the "choose boring tech" article by Dan McKinley made a big impression when I first read it, about 3 years ago.
humorous slide show version Key takeaways, based on my memory of reading it a couple years ago: Boring tech is that which is mature enough, and that you know well enough, that you are familiar with its shortcomings and that will let you build. You get 3 innovation tokens for each project or stack. Choosing some new and exciting bit of tech requires spending 1 of those tokens. A nice alternative title for this is "How to be old, for young people", which is fitting. In addition to the advice in the article, I read somewhere else that the probability of something continuing to exist in future, and be actively maintained and supported, is approximately proportional to the amount of time that it has already existed, and been supported and maintained. I'm not even sure that makes sense, but it bears consideration. This should be on my imaginary list of "things I should read every couple of years". Actually, a few lists might be really useful. A list of python articles, a list of advice articles, etc. It would be similar to the books page, which is just a list of book articles."},{"title":"An incomplete list of skills senior engineers\u00a0need","category":"Technical/Engineering","url":"an-incomplete-list-of-skills-senior-engineers-need.html","date":"7 June 2021","tags":"advice, management ","body":"This is copied from Camille Fournier's article on medium. I copied it rather than linked to it because who knows if and when Medium will change the URL or put the content behind a paywall.
Highlights are simply an impulsive "I can think about this for the next few months." After that I might change them. An incomplete list of skills senior engineers need, beyond coding. For varying levels of seniority, from senior, to staff, and beyond: How to run a meeting, and no, being the person who talks the most in the meeting is not the same thing as running it. How to write a design doc, take feedback, and drive it to resolution in a reasonable period of time. How to mentor an early-career teammate, a mid-career engineer, a new manager who needs technical guidance. How to indulge a senior manager who wants to talk about technical stuff that they don't really understand, without rolling your eyes or making them feel stupid. How to explain a technical concept behind closed doors to a senior person too embarrassed to openly admit that they don't understand it. How to influence another team to use your solution instead of writing their own. How to get another engineer to do something for you by asking for help in a way that makes them feel appreciated. How to lead a project even though you don't manage any of the people working on the project. How to get other engineers to listen to your ideas without making them feel threatened. How to listen to other engineers' ideas without feeling threatened. How to give up your baby, that project that you built into something great, so you can do something new. How to teach another engineer to care about that thing you really care about (operations, correctness, testing, code quality, performance). How to communicate project status to stakeholders. How to convince management that they need to invest in a non-trivial technical project. How to build software while delivering incremental value in the process. How to craft a project proposal, socialize it, and get buy-in to execute it. How to repeat yourself enough that people start to listen. How to pick your battles. How to help someone get promoted. How to get information about what's really happening (how to gossip, how
to network). How to find interesting work on your own, instead of waiting for someone to bring it to you. How to tell someone they're wrong without making them feel ashamed. How to take negative feedback gracefully."},{"title":"Proverbs\u00a012","category":"Non-technical/Journal","url":"proverbs-12.html","date":"2 June 2021","tags":"bible, proverbs, wisdom ","body":"The chapter starts with "Whoever loves discipline loves knowledge, but he who hates reproof is stupid." and ends with "In the path of righteousness is life, and in its pathway there is no death." The opening proverb is jarring to read, maybe because unlike "wickedness" or "righteousness", which are concepts I don't hear or think much about day-to-day, "discipline" and "knowledge" are very familiar and part of contemporary conversation. So is stupidity. So this proverb hits close to home. What does it mean? If I love discipline then I am loving knowledge, and in this case I'm confident the opposite is also true - if I avoid or dislike discipline then I avoid or dislike knowledge. Discipline is hard; I guess I need to lean into being disciplined and remember why. Encouragingly, the proverb is immediately followed by the reminder that "A good man obtains favour from the Lord, but a man of evil devices he condemns", so make an effort, do good things, and obtain favour. That's a big reason to persevere. \\(^4\\)"An excellent wife is the crown of her husband, but she who brings shame is like rottenness in his bones." If it were me I'd leave off all the negative second parts of these proverbs; it's abrasive. But also if it were me, I doubt they'd be remembered. Why is the pattern of all these proverbs "good thing, bad thing" - to make the contrast stronger? To know why to do a thing and also why not to do the opposite? To make it more punchy?
I guess if it's abrasive you are more likely to remember it, and if you're confident that what you are saying is true then you want it to be remembered. Back to the proverb: this is a great proverb - I like it. It honors wives, though it does put them in the context of their husbands, and it also speaks to how valuable and precious a good wife is - crowns are precious and rare. Empirically it also feels truthy - marriages are difficult and if either spouse can be considered "excellent" then that is unusual and valuable. Relationships can give a lot of life and health and happiness, and require a lot of effort and work. To be an excellent spouse is certainly worth desiring and celebrating. The next few verses contrast wickedness and righteousness - don't be wicked, be upright. \\(^9\\) "Better to be lowly and have a servant than to play the great man and lack bread". Don't put appearance before substance? Don't spend money on fancy cars or clothes if you won't have enough for decent food? Prioritize substance (truth?) over appearance and social pressures? Having a servant (or a domestic helper) would certainly be helpful, and of more tangible benefit than receiving public shout-outs. \\(^{10}\\) "Whoever is righteous has regard for the life of his beast, but the mercy of the wicked is cruel." - animal welfare is important. \\(^{11}\\) Whoever works his land will have plenty of bread, but he who follows worthless pursuits lacks sense. 11 and 12 are both about enjoying the good consequences of honest work; 11 advises against worthless pursuits, and 12 says that it is wicked to covet the profits of evildoers. Verses 13 - 19 are all about speaking and listening. Each proverb contrasts good and evil, or wisdom and folly. 13 - your own dishonest words will become a trap for you. 14 - your words and your works will come back to you - invest in them and dividends will be returned. 17, 18 - these contrast each other.
Your words are powerful: speaking the truth is honest (even when it isn't simple, or if it makes a situation more complicated), and being rash (acting without careful consideration of the consequences) can be as violent as wielding a sword. Wise words bring healing. \\(^{15}\\) Fools think they are right, and because of the contrasting pattern presumably do not listen to (or ask for) advice. This seems similar to the Dunning-Kruger Effect. Wise people know that they don't know everything they need to, ask for advice, and listen to it. \\(^{16}\\) I had to look up Vexation - it means "the state of being annoyed, frustrated or worried". Wise people can stay calm and ignore insults. If you get all fired up when someone insults you, you're probably (generally speaking) being foolish. \\(^{19}\\) Truthful lips endure forever, but a lying tongue is but for a moment. Truth endures, lies do not. Verses 20 - 29 have two themes - the benefits of consistent hard work, and the benefits of being truthful. As usual, each proverb is a comparison of right and wrong, or wisdom and folly. \\(^{21}\\) No ill befalls the righteous, but the wicked are filled with trouble. This is encouraging to read, and also highlights that proverbs describe truthful patterns, but are not specific guarantees - obviously righteous people will have trouble; this proverb isn't saying that life will be perfectly pleasant. As usual, it is a push towards doing good because of the benefits, and a pull away from doing wrong because of the damage. \\(^{22}\\) .. those who act faithfully are his delight. It is wonderful to read that (a good) God would delight in people. \\(^{23}\\) A prudent man conceals knowledge, but the heart of a fool proclaims knowledge. I find it surprising to read that concealing knowledge can be a desirable trait.
I guess there are certain questions that must be answered before the answers to other questions can be understood, and I know that for my young children I wouldn't answer particular questions the same as if they were adults. \\(^{24}\\) The hand of the diligent will rule, while the slothful will be put to forced labor. It's ironic that those who are diligent and able and willing to labor end up in management or leadership positions, and that those who would avoid laborious work end up doing it. \\(^{25}\\) Anxiety in a man's heart weighs him down, but a good word makes him glad. I find it encouraging (again) to read that being weighed down by anxiety is normal, and I like that anxiety isn't described as a weakness or as foolishness. (I don't want to conjecture too much about what isn't written, but there's a long list of things that fools do and the consequences of foolishness, and anxiety isn't on the list.) This proverb also emphasises and encourages the impact of good words. \\(^{28}\\) In the path of righteousness is life, and in its pathway there is no death. How wonderful to know how to find life and avoid death. It's such a pleasing way to end a chapter."},{"title":"Proverbs\u00a011","category":"Non-technical/Journal","url":"proverbs-chapter-11.html","date":"1 June 2021","tags":"books, bible, proverbs, wisdom ","body":"Table of Contents Context Notes Examples Conclusion Context These are my notes from reading Proverbs, one of the books in the bible. Almost the entire book is a collection of proverbs - short sayings that are generally true. I presume there is some structure and themes in their arrangement; taking notes should help these become clearer and hopefully also help me remember and apply them better. The book is split up into chapters, which are quite short sections of the book.
It would probably take about a minute or two to read each one. The chapters are further split up into verses. In Proverbs each verse seems to be one sentence. Numbering the text like this is useful because it lets you refer precisely to a part of the book. Proverbs does have a few headings, but they are too far apart to provide enough structure on their own, I think. I've read this book a couple of times before, and appreciated it even when I could feel that I was missing most of the wisdom in it. The book is quite easy to read - the sentences are short and the analogies seem simple enough. It's refreshing to read something that has existed for thousands of years, has withstood criticisms and feels approachable. It's practical, despite being old and written in a completely different cultural context. Notes Each proverb contrasts justice and injustice using a variety of phrases and images. Pride and humility, integrity and crookedness, righteousness and wickedness. In the context of these qualities, various situations are described: a false balance, disgrace, and more. The proverbs are somewhat exaggerated, which seems reasonable given that they are self-contained single sentences tasked with defining and then resolving a problem. The imagery is clear; it does not rely on subtlety. The biggest themes are that honesty, integrity and righteousness are to be valued, and their opposites are to be avoided at all costs. Each of these qualities is in fact an action, not a sentiment, and the consequences of these actions are reliable and consistent and inevitable. The advice to "love your neighbor like yourself" is a good summary of many of the proverbs. Its clarity and confidence is encouraging. Do good and good things will happen; do evil and the consequences are inevitably bad for you. Examples The fruit of the righteous is a tree of life, and whoever captures souls is wise - the first part sounds wonderful. Who wouldn't want to produce life?
The second part is surprising. The desire of the righteous ends only in good, the expectation of the wicked in wrath. - this is encouraging. One gives freely, yet grows all the richer; another withholds what he should give, and only suffers want. Like a gold ring in a pig's snout is a beautiful woman without discretion - this doesn't seem to fit in with any proverbs around it. Is this the only mention of a woman in this section? Why is it important enough to be included? Does this imply that beauty is discreet? There is surely a big cultural gap between Amsterdam in 2021 and wherever this was first written. A man who is kind benefits himself, but a cruel man hurts himself. - this is good to know. Conclusion Don't lie, cheat, steal or plot. Treat people like you yourself would want to be treated. Make an effort to do good and increase justice, and things will go well for you. God delights in blameless people, and he abhors people with crooked hearts. Beauty without discretion is an anomaly."},{"title":"Performance Optimizations for the shell\u00a0prompt","category":"snippet","url":"performance-targets.html","date":"28 May 2021","tags":"shell ","body":"Something should happen within 100ms of the user's input in order to maintain a feeling of responsiveness. If something happens within 50ms of the trigger event, it will feel instant. Also, check out hyperfine for benchmarking. Lots of useful tips in the original blog post."},{"title":"Vim spellcheck\u00a0commands","category":"snippet","url":"vim-spell.html","date":"27 May 2021","tags":"vim ","body":"[s or ]s → go to next/previous bad word z= → list of suggestions zg → add word to good word list zug → remove word from good word list zw → add word to bad word list zuw → remove word from bad word list"},{"title":"Questions for good\u00a0references","category":"Non-technical/Entrepreneurship","url":"questions-for-good-references.html","date":"27 May 2021","tags":"marketing ","body":"Use the
following questions to get good testimonials: What was the problem you had before you used the solution I worked on? What did the frustration feel like as you tried to solve the problem? What was different about the solution I worked on compared to other solutions? Take us to the moment you realized our solution was actually working to solve the problem. What does life look like now that the problem is being solved? Customize the text as necessary, e.g. for products or services, complete solutions or a component of a solution. I made a template. Source: Building a story brand"},{"title":"Lessons from 45 years in the software\u00a0industry","category":"Technical/Engineering","url":"45-years-in-software.html","date":"27 May 2021","tags":"advice ","body":"An article from a recently retired software engineer about lessons learned over 4 decades. Beware the curse of knowledge. Focus on the fundamentals: Teamwork, Trust, Testing, Communication, Code Design, Simplicity. Seek first to understand. Beware lock-in and the cost of change. Be honest and acknowledge when you don't fit the role."},{"title":"Modeling Credit\u00a0Risk","category":"Technical/Data","url":"credit-data.html","date":"25 May 2021","tags":"finance, precision, recall, colinearity, variance-inflation-factor, logistic-regression, confusion-matrix, f-score, f1, f2 ","body":"Data Exploration Exercise¶ Using whichever methods and libraries you prefer, create a notebook with the following: Data preparation and exploration. Identify the three most significant data features which drive the credit risk. Model the credit risk. Model validation and evaluation using the methods that you find correct for the problem. Your solution should have instructions and be reproducible. For instance, if your choice is a python notebook, your notebook should install all the required dependencies to run it.
Import and preparation¶ In [1]: # display more than 1 output per cell from IPython.core.interactiveshell import InteractiveShell InteractiveShell.ast_node_interactivity = "all" In [2]: %%capture import sys !{sys.executable} -m pip install --upgrade pip !{sys.executable} -m pip install pandas numpy sklearn matplotlib seaborn statsmodels In [3]: import pandas as pd pd.options.display.max_columns = None # show all columns in a pandas dataframe In [4]: data = pd.read_csv(…) data.head() Out[4]: (the first five rows of the dataset; 21 columns including checking_status, duration, credit_history, purpose, credit_amount, savings_status, employment, age, housing, job, num_dependents, own_telephone, foreign_worker and class) Data Exploration In [5]: ### overview of the dataset data.shape data.columns data.nunique() # unique values per column print(f'class: {sum(data["class"] == "good")} "good" rows') print(f'class: {sum(data["class"] == "bad")} "bad" rows') Out[5]: (1000, 21) Out[5]: Index(['checking_status', 'duration', 'credit_history', 'purpose', 'credit_amount', 'savings_status', 'employment', …, 'age', …, 'housing', …, 'job', …, 'class'], dtype='object') Out[5]: the unique values per column range from 2 (e.g. class, own_telephone, foreign_worker) to 921 (credit_amount) class: 700 "good" rows class: 300 "bad" rows In order to visually inspect the data it's necessary to convert the datatype of the categorical features from string to category. This will also be necessary to train the model.
Therefore the data will be formatted first. Ideally there would be an approximately equal number of outcome classes. In this dataset the split is 30% "bad" and 70% "good". The low number of bad credit risk assessments may limit the model's ability to accurately predict a bad credit assessment relative to good assessments, because of the limited training examples. This could result in more False Positives than would typically be expected. Data formatting¶ In [6]: # remove whitespace around column names data.columns = [col.strip() for col in data.columns] In [7]: # Categorical variables have a limited and usually fixed number of possible values. # Categorical data might have an order (e.g. 'strongly agree', 'agree', 'disagree', 'strongly disagree') unordered = ['purpose', 'housing', 'job', 'class'] for col in unordered: data[col] = data[col].astype('category') ordered = [ ('checking_status', ["no checking", "<0", "0<=X<200", ">=200"], True), ('savings_status', ['no known savings', '<100', '100<=X<500', '500<=X<1000', '>=1000'], True), ('employment', ['unemployed', '<1', '1<=X<4', '4<=X<7', '>=7'], True), ] for col, categories, is_ordered in ordered: data[col] = pd.Categorical(data[col], categories=categories, ordered=is_ordered) In [8]: # convert categories to numerical values, for SelectKBest cat_columns = data.select_dtypes(['category']).columns data[cat_columns] = data[cat_columns].apply(lambda x: x.cat.codes) In [9]: # all columns are now either categorical and encoded as an int (ordered or unordered) or numerical. data.dtypes Out[9]: (checking_status int8, duration int64, credit_history int8, purpose int8, credit_amount int64, savings_status int8, employment int8, …, age int64, housing int8, job int8, num_dependents int64, own_telephone int8, foreign_worker int8, class int8) In [10]: # this will take a while.. import seaborn as sns # Create the default pairplot pairplot = sns.pairplot( data, hue="class", diag_kind='kde', plot_kws={'alpha': 0.6, 's': 80, 'edgecolor': 'k'}, height=3 ) fig = pairplot.fig fig.savefig(…, dpi=200) # default dpi is 100 Visual inspection¶ The pairplot above compares each feature to each of the other features.
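The ordered-category encoding used above can be sketched in isolation. This is a minimal example with a made-up five-row checking_status column, not the notebook's full dataset:

```python
import pandas as pd

# Encode an ordered categorical feature as integer codes, as the notebook
# does for checking_status, savings_status and employment.
df = pd.DataFrame({"checking_status": ["no checking", "<0", "0<=X<200", ">=200", "<0"]})
order = ["no checking", "<0", "0<=X<200", ">=200"]
df["checking_status"] = pd.Categorical(df["checking_status"], categories=order, ordered=True)
codes = df["checking_status"].cat.codes.tolist()  # order-preserving integer codes
print(codes)  # [0, 1, 2, 3, 1]
```

Because the categories are declared in order, the integer codes preserve the ranking ("no checking" < "<0" < "0<=X<200" < ">=200"), which is what lets downstream statistics treat the column as ordinal.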
The plots along the diagonal show the density plot of that feature, grouped by the value of the "class" feature that the model will predict. Inspecting the pairplot shows that each of the features has approximately the same distribution for each class value. Of the continuous features, Credit Amount is most heavily left skewed. Age is also left skewed. It may improve the model to log transform credit_amount and age, particularly when predicting the credit class when credit_amount or age values are in the middle of their ranges. This could be done as follows: data['credit_amount'] = np.log(data['credit_amount']) In [11]: # Split the data into features and outcome. X = data.drop(columns=['class']) # all columns except the "class" column y = data['class'] # only the "class" column Check for colinearity¶ Multicolinearity emerges when three or more model variables are highly correlated; it can emerge even when isolated pairs of variables are not colinear. The Variance Inflation Factor (VIF) is a measure of colinearity among predictor variables. It is calculated by dividing the variance of all model betas by the variance of a single beta. In [12]: from statsmodels.stats.outliers_influence import variance_inflation_factor In [13]: vif = pd.DataFrame() vif["VIF Factor"] = [variance_inflation_factor(X.values, i) for i in range(X.shape[1])] vif["features"] = X.columns vif = vif.sort_values(by=['VIF Factor']) vif.round(2) Out[13]: VIF Factor / features: 2.11 own_telephone; 2.12 checking_status; 2.50 property_magnitude; 2.53 savings_status; 3.56 job; 3.59 personal_status; 4.41 purpose; 5.09 credit_amount; 5.29 housing; 5.59 other_payment_plans; 5.85 credit_history; 5.85 employment; 7.70 duration; 7.89 existing_credits; 8.71 residence_since; 9.77 installment_commitment; 11.59 num_dependents; 13.19 age; 17.03 other_parties; 23.45 foreign_worker The results show that there is a significant amount of colinearity in the dataset. We are about to identify the features that drive the credit risk decision the most; this will reduce colinearity, random noise and computation. Once we know which subset of features contributes most to the predictive value of the model we can check for colinearity within that subset. Identify the three most significant features which drive credit risk¶ Feature
selection is important because it removes irrelevant or redundant predictors from the model. This improves model performance and reduces computational resource requirements. Various statistical tests could be used to find the most significant features; they can be placed into two broad categories - supervised (using domain knowledge and considering the relationship of the feature to the target variable) and unsupervised (which ignores the feature's relevance to the target). The ANOVA F-value method is appropriate for numerical inputs and categorical outputs. The Chi^2 test is appropriate for categorical inputs with categorical outputs. Feature significance is evaluated using both methods. In [14]: X = data.drop(columns=['class']) # all columns except the "class" column y = data['class'] # only the "class" column In [15]: from sklearn.feature_selection import SelectKBest, chi2, f_classif In [16]: %%capture selector = SelectKBest(f_classif, k=3) selector.fit(X, y) cols = selector.get_support(indices=True) new_data = X.iloc[:, cols] In [17]: sorted(zip(X.columns, selector.scores_), key=lambda x: x[1], reverse=True) Out[17]: (feature, ANOVA F-score) pairs for every feature, including duration, purpose, employment, age, housing and job In [18]: %%capture selector = SelectKBest(chi2, k=3) selector.fit(X, y) cols = selector.get_support(indices=True) new_data = X.iloc[:, cols] In [19]: sorted(zip(X.columns, selector.scores_), key=lambda x: x[1], reverse=True) Out[19]: (feature, Chi^2 score) pairs for every feature The three most significant features are: Duration (ANOVA: 48, Chi2: 321) Checking Status (ANOVA: 40, Chi2: 36) Credit Amount (ANOVA: 24, Chi2: 5826) The most significant feature is Duration, followed by Checking Status and Credit Amount. This makes sense intuitively because Duration and Credit Amount must be proportional to risk. Checking Status is unfamiliar and I don't know what it means; in a real scenario I would speak to stakeholders or team members to learn more about this feature. In [20]: cols = ['duration', 'credit_amount', 'checking_status'] X = X[cols] In [21]: vif = pd.DataFrame() vif["VIF Factor"] = [variance_inflation_factor(X.values, i) for i in range(X.shape[1])] vif["features"] = X.columns vif = vif.sort_values(by=['VIF Factor']) vif.round(2) Out[21]: VIF Factor / features: 1.69 checking_status; 3.84 credit_amount; 4.46 duration Using an arbitrary but common VIF threshold value of 5, we see that none of the 3 most significant features are colinear.
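The SelectKBest flow above can be sketched on synthetic data. Here make_classification stands in for the credit dataset, so the feature indices and scores are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for the credit data: 8 features, 3 of them informative.
X, y = make_classification(n_samples=300, n_features=8, n_informative=3,
                           n_redundant=0, random_state=0)

# Keep the 3 features with the highest ANOVA F-value against the target.
selector = SelectKBest(score_func=f_classif, k=3).fit(X, y)
top = selector.get_support(indices=True)      # column indices of the kept features
print(top, selector.scores_[top].round(1))    # ANOVA F-scores of the kept features
```

Swapping `f_classif` for `chi2` gives the Chi^2 ranking the notebook also computes (chi2 requires non-negative inputs, which the integer-coded categoricals satisfy when no codes are -1).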
However duration is close with a score of 4.46 and credit_amount is second with 3.84. Intuitively it seems reasonable that credit_amount and duration might have some colinearity, as they could generally be expected to be inversely correlated to each other - you could borrow more for a short period of time than you could for a long one. Optimize the number of features to use¶ After an initial demonstration of the metrics used to assess model performance, the optimal number of features to use will be determined. In [22]: # used later sorted_features = sorted(zip(X.columns, selector.scores_), key=lambda x: x[1], reverse=True) Modeling Credit Risk¶ We use logistic regression because this is a binary classification problem with multiple predictor variables that are a mixture of categorical and numerical. Logistic Regression is a reasonable first choice of model type to use, but in a longer project we could use multiple model types and compare their performance against one another, then select the classifier which performs the best. Other types could include k-nearest neighbors, decision trees, or Support Vector Machines. Logistic regression makes the following assumptions: No multicolinearity among the independent variables. This can be tested using the Variance Inflation Factor (VIF). Independence of errors (residuals), or no significant autocorrelation. The residuals should not be correlated with each other. This can be tested using the Durbin-Watson test, but this is out of scope for this exercise. The sample size should be large (at least 50 observations per independent variable are recommended) -> 20 $\\times$ 50 = 1000. Splitting the data into training, validation and testing sets¶ A validation set is used in addition to training and testing sets. This is because I expect to optimize model parameters by comparing the performance of different models using a dataset they weren't trained on.
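The VIF check referred to above can be demonstrated on synthetic data, where one column is deliberately almost a copy of another. All names here (x1, x2, x3) are illustrative, not columns from the credit dataset:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = pd.DataFrame({
    "x1": x1,
    "x2": x1 + rng.normal(scale=0.1, size=200),  # nearly a copy of x1 -> colinear
    "x3": rng.normal(size=200),                  # independent noise
})

# VIF for column i: regress column i on the other columns; VIF = 1 / (1 - R^2).
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
print(vif.round(2))
```

The colinear pair (x1, x2) gets VIF scores far above the common threshold of 5, while the independent column stays near 1. Note that, as in the notebook, no intercept column is added before computing VIF, which can shift the values slightly for non-centered data.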
If this were the same dataset that the final performance metric was based on, then the model would be overfitted to it. Instead, the performance of optimizations will be quantified using the validation set, and the optimal model's performance will be quantified using the testing set. This gives a better prediction of actual performance against new production data (outside the available dataset) than only using training and testing data sets. In [23]: from sklearn.model_selection import train_test_split from sklearn.linear_model import LogisticRegression In [24]: # 70% training, 15% validation, 15% testing is reasonable # use train_test_split twice, first to split out a testing set, then to split the remainder into validation and training X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=150) X_train, X_validation, y_train, y_validation = train_test_split(X_train, y_train, test_size=150) assert len(X_train) == len(y_train) and len(X_validation) == len(y_validation) and len(X_test) == len(y_test) print(f"Training examples: {len(X_train)}") print(f"Validation examples: {len(X_validation)}") print(f"Testing examples: {len(X_test)}") Training examples: 700 Validation examples: 150 Testing examples: 150 In [25]: from collections import Counter y_train_stats = Counter(y_train) print(f"training set has {y_train_stats[0]} negative values and {y_train_stats[1]} positive values") print(f"result ratio is approximately 32:68") training set has 222 negative values and 478 positive values result ratio is approximately 32:68 The training set has over twice as many positive values as negative values. This is similar to the complete data set, which has 700 "good" credit class examples and 300 "bad" credit class examples. Training the logistic regression model¶ The model is trained on the training data set. In this simple first case, we evaluate performance (discussed below) using the test data set. After an initial discussion of the results, the performance of different models is evaluated using a validation data set.
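The double train_test_split described above can be sketched with dummy data. Using integer test sizes reproduces the 700/150/150 counts exactly:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Dummy stand-ins for the 1000-row feature matrix and labels.
X = np.arange(1000).reshape(-1, 1)
y = np.zeros(1000)

# First split off 150 rows for testing, then 150 of the remainder for validation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=150, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=150, random_state=0)
print(len(X_train), len(X_val), len(X_test))  # 700 150 150
```

Fixing random_state makes the three sets reproducible between runs, which matters later when models with different feature subsets are compared against the same validation rows.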
In [26]: logreg = LogisticRegression() logreg.fit(X_train, y_train) y_pred = logreg.predict(X_test) Model evaluation¶ The model could be evaluated in many different ways, each with different strengths and weaknesses depending on the characteristics of the data and the goals of the model. Where possible, it's preferable to use only a single metric to quantify and compare model performance, because it's simpler (and less error prone) than if there are multiple metrics to consider. There are 4 basic categories of results that the model could produce: True Positives (TP) False Positives (FP) True Negatives (TN) False Negatives (FN) These can be combined to calculate the model's Precision and Recall. Optimizing Precision minimizes false positives but ignores false negatives. Optimizing Recall minimizes false negatives but ignores false positives. P = True Positives / (True Positives + False Positives) -> if False Positives are 0, P=1 R = True Positives / (True Positives + False Negatives) -> if False Negatives are 0, R=1 Precision is the ratio of the number of true positives divided by the sum of the true positives and false positives. It describes how good a model is at predicting the positive class. It is the ability of the classifier not to label as positive a sample that is negative. High precision implies fewer false positives. Precision = True Positives / (True Positives + False Positives) Recall is calculated as the ratio of the number of true positives divided by the sum of the true positives and the false negatives. It quantifies the ability of the model to find all the positive samples. High recall implies fewer false negatives. Recall = True Positives / (True Positives + False Negatives) An ideal model will have good precision and recall. The F-Score provides a way to combine precision and recall into a single measure that captures both properties. F-Measure = (2 * Precision * Recall) / (Precision + Recall) The influence of precision and recall relative to each other can be changed by adding a coefficient into the F-measure.
Further discussion of this is outside the scope of this brief exercise, except to say that F1 (above) has beta=1 and places equal weight on precision and recall. F2 (beta=2) places less weight on precision and more on recall, and F0.5 emphasizes precision over recall. Other metrics to consider include Precision-Recall curves and the Receiver Operating Characteristic curve (ROC). ROC curves are more informative when evaluating datasets with equal proportions of positive and negative results. In a skewed dataset such as this one, ROC can be overly optimistic and Precision-Recall curves are more reliable. Therefore, after briefly demonstrating the confusion matrix, we consider the average precision and recall instead of comparing the area under the ROC curve for each model. Summary of results with the 3 most significant features¶ In [27]: from sklearn import metrics print(f"Precision: {round(metrics.precision_score(y_test, y_pred), 3)}") print(f"Recall: {round(metrics.recall_score(y_test, y_pred), 3)}") print(f"F1: {round(metrics.f1_score(y_test, y_pred), 3)}") Precision: 0.789 Recall: 0.91 F1: 0.845 Confusion Matrix¶ The confusion matrix simply shows the number of True Positives, True Negatives, False Positives and False Negatives that the model produces. These results are based on the testing data set. This model was trained on the 3 most significant features. In [28]: import numpy as np import matplotlib.pyplot as plt import seaborn as sns %matplotlib inline In [29]: # name of classes class_names = [0, 1] fig, ax = plt.subplots() tick_marks = np.arange(len(class_names)) plt.xticks(tick_marks, class_names) plt.yticks(tick_marks, class_names) cnf_matrix = metrics.confusion_matrix(y_test, y_pred) sns.heatmap(pd.DataFrame(cnf_matrix), annot=True, cmap="YlGnBu", fmt='g') plt.title('Confusion matrix', y=1.1); plt.ylabel('Actual label'); plt.xlabel('Predicted label'); Considering both precision and recall is useful in cases where there is an imbalance in the observations between the two classes. This is because the large number of class 1 ("good") examples means we are less interested in the skill of the model at predicting this correctly, e.g. high true positives, and instead are more concerned with how the model can predict true negatives (where the credit class is "bad").
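The precision, recall and F-score definitions above can be checked on a tiny hand-made example (the labels below are illustrative, not rows from the credit data):

```python
from sklearn.metrics import precision_score, recall_score, f1_score, fbeta_score

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]  # one FP (index 4), one FN (index 3)

p = precision_score(y_true, y_pred)       # TP / (TP + FP) = 4 / 5
r = recall_score(y_true, y_pred)          # TP / (TP + FN) = 4 / 5
f1 = f1_score(y_true, y_pred)             # 2PR / (P + R)
f2 = fbeta_score(y_true, y_pred, beta=2)  # beta=2 weights recall more than precision
print(p, r, f1, f2)
```

With 4 true positives, 1 false positive and 1 false negative, precision and recall are both 0.8, so every F-beta variant also equals 0.8; the beta weighting only changes the result when precision and recall differ.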
The confusion matrix above shows that the largest error group is False Positives, which is expected because the model has relatively few negative examples compared to positive ones. This is why the precision is lower than the recall. A no-skill classifier is one that cannot discriminate between the classes and would predict a random or constant class in all cases. This creates a minimum precision called the "no-skill" line. Its value is the ratio of positive cases in the dataset. For our training data set it is 0.687. (Note it is not 0.7, due to the random selection of the training set.) The Precision-Recall metric is a useful measure of success of prediction when the classes are very imbalanced, as they are in this case. A Precision-Recall curve focuses on the performance of a classifier on the positive class. The curve shows the tradeoff between precision and recall for different probability thresholds, which we refer to but do not investigate in detail in this exercise. The threshold is the probability at which a "good" or "bad" label is applied. The default threshold is 0.5, which means that if the model calculates the probability of an observation being "good" as more than 0.5 then it applies the label "good", otherwise it applies the label "bad". This threshold probability can be varied and the impact on model performance measured. A high area under the curve represents both high recall and high precision. High scores for both show that the classifier is returning accurate results (high precision) as well as returning a majority of all positive results (high recall). A system with high recall but low precision returns many positive results, but most of its predicted labels are incorrect when compared to the ground truth. A system with high precision but low recall is the opposite, returning very few positive results, but most of them are correct.
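The threshold behaviour described above can be sketched with predict_proba on synthetic data; the 0.3 threshold below is an arbitrary illustration, not a recommendation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

# Synthetic 30/70 imbalanced problem, echoing the bad/good class ratio.
X, y = make_classification(n_samples=400, weights=[0.3, 0.7], random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

proba = clf.predict_proba(X)[:, 1]      # P(class == 1) for each row
default = (proba >= 0.5).astype(int)    # what .predict() does by default
lenient = (proba >= 0.3).astype(int)    # lower threshold: more rows labelled positive

r_default = recall_score(y, default)
r_lenient = recall_score(y, lenient)
print(r_default, r_lenient)
```

Lowering the threshold can only add predicted positives, so recall never decreases; the cost is extra false positives, i.e. lower precision. Sweeping the threshold across [0, 1] traces out the Precision-Recall curve.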
An ideal system with high precision and high recall will identify almost all positive cases, with all results labelled correctly. AP summarizes a Precision-Recall curve as the weighted mean of precisions achieved at each threshold, with the increase in recall from the previous threshold used as the weight. In [30]: from sklearn.metrics import plot_precision_recall_curve disp = plot_precision_recall_curve(logreg, X_test, y_test) no_skill = sum(y_test) / len(y_test) # the proportion of results that are "good" plt.plot([0, 1], [no_skill, no_skill], label='No Skill'); Summary¶ Using the 3 most significant features we have the following results: F1: 0.845 AUC: 0.87 The model could be improved by finding the optimal number of features to consider. Improving the model by optimizing the number of features¶ The code below iterates through the available features from most significant to least, adding an additional factor in each loop. Each loop generates a new logistic regression model and trains it on the same training data set. Its performance is evaluated using the same validation data set. The results are summarized in the "results" dataframe and plotted. Due to the very small amount of data, changing the random_state value can produce results with different trends. Therefore the first recommendation to improve the model is to get more data.
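Average precision and the no-skill baseline can be illustrated with hand-made scores. The labels below have a positive ratio of 0.7, standing in for the notebook's 0.687 training-split ratio:

```python
import numpy as np
from sklearn.metrics import average_precision_score

# 7 positives, 3 negatives; a model that mostly ranks positives higher.
y_true = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.8, 0.7, 0.6, 0.4, 0.3, 0.5, 0.2, 0.1])

ap = average_precision_score(y_true, scores)  # weighted mean of precisions over thresholds
no_skill = y_true.mean()                      # precision of a classifier with no skill
print(round(ap, 3), no_skill)
```

A useful classifier keeps its Precision-Recall curve, and hence its AP, above the horizontal no-skill line; an AP at or near the positive ratio indicates the scores carry no ranking information.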
This is also reflected in the conclusions below. In [31]: %%capture X = data.drop(columns=['class']) # start with all the features max_features = 10 i = 1 results = [] while i <= max_features: print(i) X = data.drop(columns=['class']) # all columns except the "class" column # select the features we want to use in the model cols = [j[0] for j in sorted_features[:i]] # sorted_features is sorted descending X = X[cols] # remake the training sets with the correct features X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=150, random_state=1) # random state is specified, ensuring same groups for each test X_train, X_validation, y_train, y_validation = train_test_split(X_train, y_train, test_size=150, random_state=1) logreg = LogisticRegression() logreg.fit(X_train, y_train); y_pred = logreg.predict(X_validation) # use validation this time (not test) because we will compare different models disp = plot_precision_recall_curve(logreg, X_validation, y_validation) fscore = (2 * disp.precision * disp.recall) / (disp.precision + disp.recall) lr_f1 = metrics.f1_score(y_validation, y_pred) results.append({'i': i, 'f1': lr_f1, 'AUC': disp.average_precision}) i += 1 results = pd.DataFrame(results) results.set_index('i', inplace=True) In [32]: results Out[32]: f1 / AUC by i: 1: 0.852590 / 0.740326; 2: 0.850202 / 0.810601; 3: 0.859438 / 0.855195; 4: 0.832653 / 0.826140; 5: 0.825911 / 0.873120; 6: 0.825911 / 0.874651; 7: 0.811475 / 0.876348; 8: 0.832653 / 0.878845; 9: 0.826446 / 0.889659; 10: 0.825000 / 0.881342 Optimization Results¶ The results show that the highest F1 score (which places equal emphasis on Precision and Recall) of 0.859 is obtained using a logistic regression model that uses the three most significant features (Credit Amount, Checking Status and Duration). Due to the small data sets (700 examples in the training set, 150 in each of the validation and testing sets) the results can vary significantly when a different random number seed is used to generate the 3 datasets. Whilst the table above shows that AUC increases with the number of features, the increase is small (0.9% from 5 to 10 features) and sensitive to the random number seeds - if a different seed is used, the same trend is not observed.
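The optimisation loop above can be sketched end-to-end on synthetic data. SelectKBest here stands in for the notebook's pre-sorted feature list, and all sizes and seeds are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 10 candidate features, 4 of them informative.
X, y = make_classification(n_samples=600, n_features=10, n_informative=4, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

results = {}
for k in range(1, 11):
    sel = SelectKBest(f_classif, k=k).fit(X_tr, y_tr)  # rank features on training data only
    clf = LogisticRegression(max_iter=1000).fit(sel.transform(X_tr), y_tr)
    # Score every candidate model against the same validation rows.
    results[k] = f1_score(y_val, clf.predict(sel.transform(X_val)))

best_k = max(results, key=results.get)
print(best_k, round(results[best_k], 3))
```

Because every candidate is scored on the same held-out validation rows, the loop compares feature counts fairly; the final chosen model should then be reported on the untouched test set, exactly as the notebook does afterwards.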
Therefore, whilst Precision-Recall curves are a useful tool for imbalanced datasets, the small size of the dataset being used in this analysis creates variation that wouldn't be observed in larger datasets. Model performance using the Test Data Set¶ We test the performance of the model chosen in the optimization stage. In this stage we use the testing dataset, because this provides a more reliable indication of how the model might perform with (new) production data compared to using the results from the validation set. In [33]: cols = ['duration', 'credit_amount', 'checking_status'] X = data[cols] X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=150, random_state=1) # random state is specified, ensuring same groups for each test X_train, X_validation, y_train, y_validation = train_test_split(X_train, y_train, test_size=150, random_state=1) logreg = LogisticRegression() logreg.fit(X_train, y_train) y_pred = logreg.predict(X_test) disp = plot_precision_recall_curve(logreg, X_test, y_test) #fscore = (2 * disp.precision * disp.recall) / (disp.precision + disp.recall) #fscore metrics.f1_score(y_test, y_pred) Out[33]: Out[33]: Conclusions¶ Obtain more training data. This will reduce the influence of randomness on the model scores. Use a credit score instead of just a "good" or "bad" label. The "good" or "bad" status is probably the result of a threshold being applied to a metric. Accessing the underlying metric would provide more detail and help train a more precise model. Consider feature engineering. This would require domain expertise.
"},{"title":"Grep only inside particular\u00a0files","category":"snippet","url":"grep-particular-files-only.html","date":"18 May 2021","tags":"grep, linux ","body":"grep -inr --include package.json \\ 'shortcut\" {' . -A 3 It's the --include flag that does the filtering. -i → case insensitive -n → print line numbers -r → recursive from starting point . → start in the current directory -A 3 → print the 3 lines below the found line"},{"title":"View a List of Keyboard Mappings in\u00a0Vim","category":"snippet","url":"vim-debug-mapping.html","date":"18 May 2021","tags":"vim ","body":":map → show a list of the current keyboard mappings for normal, visual, select and operator pending modes :map! → show a list of the current keyboard mappings for insert and command-line modes To put all the mappings into a convenient text file: :redir! > vim_maps.txt :map :map! :redir END"},{"title":"Note Taking and Knowledge\u00a0Systems","category":"snippet","url":"note-taking-and-knowledge-systems.html","date":"18 May 2021","tags":"zettelkasten, notes, knowledge-system ","body":"This blog post arrives at the conclusion that the only way to take good paper notes is to organise them, and to only summarise the content once the notebook is full. Keep it simple; trust that simple → robust → reliable."},{"title":"Building A Story\u00a0Brand","category":"Non-technical/Learning","url":"building-a-story-brand.html","date":"17 May 2021","tags":"reading, marketing, communication, book ","body":"\u201cBuilding a story brand\u201d by Donald Miller is one of the most helpful marketing books I've found. It explains why things should be done, as well as how to do them. As an engineer I love solutions to problems that start from first principles, and this feels like that. It's strategy as well as tactics. These are my notes on the book. Table of Contents Premise Stories Hero Problem Guide Recap Plan A Call To Action What is at stake?
Avoid Failure End with Success People want your brand to participate in their transformation Implementation Websites Corporations Marketing Road-map One-liner Create a lead generator An automated email campaign Collect and tell stories Create a referral system Premise People want to survive and thrive. They are the main character in the movie of their own life and they have problems. Life is busy and hard and complicated and they don't want to waste energy figuring out if you can help them. Good marketing shows that the product has an obvious benefit for the customer. How does your product improve someone's ability to "survive and thrive"? Cut through a noisy and challenging world by having a super quick and easy to understand message. Don't be clever, be clear. Don't make the customer spend calories trying to figure out how they will benefit from your product. Stories are intuitive and leverage many psychological features. They organise information in an intuitive way and are a great way to combat noise and gain attention. People are compelled to pay attention until all the "story gaps" have been closed. Could a caveman look at your website and answer: What do you offer? How will it improve my life? What are the next steps? Story Gaps A gap between a character and what they want. Will they find their way to success, overcome their challenges? Cadence and momentum are defined by the creation and fulfillment of story gaps. If you fail to define something that the customer wants, you fail to open a story gap. This makes the story uninteresting because there isn't a question that requires an answer. Story gaps work because we want things to resolve. It's like singing Twinkle Twinkle Little Star but stopping before the "are" on the last line. You need to hear the last note of the melody for the tune to feel complete. Pare down the customer's desire to a single focus.
Make your brand known for a single specific desire and helping people get it. Don't clutter the story by diluting the hero's desire with other desires. You can expand eventually. Apple When Apple released the Lisa computer in 1983, Jobs bought a 9 page ad in the New York Times listing the computer's features. When Jobs returned to Apple after being fired (and after partly founding Pixar, which tells stories), Apple became a customer centric company, and their marketing was about the customers. 9 pages became 2 words - "Think Different". A message about their customers, and their customers' need to survive and thrive. Apple isn't the hero in the "Think Different" brand, the customer is. It's the same with Nike - the athletes are the heroes, and you can become a (hero) athlete with Nike's help. Apple plays a role more like Q in James Bond, giving the hero what they need to win. Stories A Hero → has a Problem → and meets a Guide → who gives them a Plan → and calls them to Action → that ends in Success → and helps them avoid Failure. Hero The customer, not you. You are the guide. The customer is the main character in their life and is battling internal and external adversity in order to survive and thrive. Heroes have weaknesses; the guide is usually the strong one. A story starts with a character who wants to overcome external challenges that pose internal and philosophical problems. Your first task is to create a story gap that implicitly asks "Will the hero get what they need?" You need to define the character's ambition at the beginning, so that the audience knows what's at stake and what kind of a story it is. This messaging implicitly tells them how they could benefit and why they should care. Problem Customers are more motivated to solve internal problems than external ones. If you identify a customer's problem then you show that you understand them.
This is a great hook because they will then relate to your brand. The more we talk about the problems our customers are experiencing, the more interest they will have in our brand. Every story needs a villain, which is the personified problem. Germs are personified, envy is personified, etc. Diminished social status is personified by someone with more status. The villain: should be relatable - readers should recognize the villain and agree that the villain should be disdained; should be a root source - frustration is what the villain makes us feel, high taxes are the cause and are therefore the villain; should be singular - just one villain, keep it simple to cut through the noise; should be real - don't be a fear monger, fight a real problem on behalf of your customers. There are three types of problem: External - A barrier to stability that must be removed - My business isn't growing fast enough, profits are too small. Internal - I have self-doubt. Do I have what it takes to succeed? Philosophical - I deserve to be successful; my hard work should be rewarded. Failure would be unjust. The villain initiates an external problem that causes the hero to encounter an internal problem that is wrong or unjust. The purpose of the external problem is to manifest the internal problem. Customers should recognise and relate to both types of problem. Put your product in the context of a type of survival that they want. Otherwise there isn't a story gap.
Translate the external problem into several categories of survival: Resources - Conserving or accumulating money or\u00a0time. Social - Gaining status or a social\u00a0network. Generosity - Most people are not as Darwinian as we\u2019ve been led to believe, they want to be empathetic and\u00a0caring. Purpose - Give customers an opportunity to be generous and participate in something greater than themselves - \u201cthe chief desire of man is not pleasure, but\u00a0meaning\u201d. The only reason that customers buy is because the external problem that your product would solve is causing an internal frustration. If you identify and articulate that frustration and then clearly, confidently and repeatedly offer to solve it along with the original external problem then you bond with your customer. You\u2019ll have positioned yourself more deeply into their internal narrative and substantially differentiated your\u00a0brand. Philosophical problems are important because people want to be involved in something larger than themselves. It adds depth and meaning. Representing (and solving) a philosophical problem gives customers a way of expressing themselves that they otherwise wouldn\u2019t have. If you can resolve all three problems in the same transaction then customers will experience a wave of relief and pleasure, and love your\u00a0brand. When Luke Skywalker blows up the Death Star by aiming the perfect shot, he defeats the external problem (the enemy army), his internal problem (self-doubt) and the philosophical problem. Your CTA is the action that must be taken to close the final story gap. Checklist: Is there a single villain that your brand opposes? What external problem is that villain causing? The internal problem is probably found by considering how the external problem makes your customer\u00a0feel. What is unjust or wrong about the suffering caused by that\u00a0villain? Our hero is being challenged - will they be able to solve their problem? The only way to find out is to engage with the brand. Guide: Customers aren\u2019t looking for another hero, they are the hero in their life.
They are the main character around which the movie of their life revolves. They\u2019re looking for a guide because they need\u00a0help. A person\u2019s life is made up of many acts - \u201cdoorways of no return\u201d. Each life is unique but we all have commonalities - we are all on journeys. Story chapters are book-ended by events. These events are always instigated by external actors or events beyond their\u00a0control. Heroes need a guide who is trustworthy and earns respect. If they didn\u2019t need a guide there wouldn\u2019t be a narrative, or a problem. Everyone is looking for a guide to help them solve their problems. We wake up each morning as a hero. We are troubled by internal, external and philosophical issues. We know that we can\u2019t solve our issues on our own. This insight has consequences and raises questions: it means our story isn\u2019t about us but about others. We can aspire to be someone else\u2019s guide, but not their hero. It shows why the search for meaning is innate, but can only be resolved by becoming a servant or\u00a0guide. Stop losing sleep over the success of your company and start losing sleep over the success of your customers. In stories, heroes are not the strongest characters; they have self-doubt and are often ill-equipped. They are often reluctant, and are thrown into the story by external events. They are \u201cchosen by\u00a0destiny\u201d. In contrast, the guide has already \u201cbeen there and done that.\u201d They have already conquered the hero\u2019s internal and philosophical challenges in their own\u00a0backstory. The guide is the one with authority, which the hero instinctively recognises and accepts. The guide has much more authority than the hero but the main character is still the\u00a0hero. Those who realize that the epic story of life is not about them end up winning in the end. This is paradoxical. Those who think they are a hero and win usually end up being remembered as a\u00a0villain. The traits of a\u00a0guide: Empathy and Authority are a precise one-two\u00a0punch. 1.
Empathy \u2192 Understanding \u2192\u00a0Trust. When we empathise with customers we show that we understand them and their problems. People want to be seen, heard and understood. This is the essence of\u00a0empathy. Key phrases in your marketing copy could\u00a0be: we understand how it feels to \u2026 no-one should have to experience \u2026 like you, we are frustrated by\u00a0\u2026 Expressing empathy isn\u2019t difficult. Once you\u2019ve identified your customers\u2019 internal problem, let them know that you understand and would like to help them find a\u00a0resolution. Brains like to conserve calories, energy, effort and time, so when a customer realises they have a lot in common with a brand, they fill in any gaps with trust. A customer will \u201cbatch\u201d their thoughts, which means they are thinking in chunks rather than in details. Commonality, whether in music taste or values, is a powerful source of trust. 2.\u00a0Authority: No one likes a know-it-all and no one wants to be preached at\u2026 But people do want you to establish competence. When looking for a guide, a hero trusts someone who has demonstrable competence. The guide should have some serious experience helping other heroes win their day, but doesn\u2019t need to be\u00a0perfect. There are four ways to add authority (competence) to your marketing without preaching: Testimonials, Logos, Statistics, Awards. Meeting a brand is like meeting a\u00a0person: Can you help them live a better life? Can they associate their identity with your\u00a0brand? Can I respect this\u00a0brand? Can I trust this\u00a0brand? Recap: We started the narrative by identifying something that the hero\u00a0wants. Then we created intrigue and tension by defining the hero\u2019s problem. The audience wants to know if we can help them overcome the\u00a0problem. Then we introduced ourselves as the guide and established authority, empathy and\u00a0trust. What next? Making a purchase always involves a small risk of wasting money.
This risky element makes a purchase somewhat similar to starting a relationship. There is a potential downside: the customer might end up feeling foolish and regretting it. Imagine a customer trying to cross a river to get to their purchase. They can hear the sound of a waterfall downstream. If they try to cross it by making a purchase then there is a chance that something bad could happen. Put stones in the river so they know how to safely walk across, step by\u00a0step. The stones are the\u00a0plan. Plan: In a movie, the guide gives the hero a plan. The plan tightens the focus of the movie and creates a \u201cpath of hope\u201d for the hero that might, possibly, lead to the resolution of the hero\u2019s problems. It creates a story gap and implicitly creates questions that the audience want to be\u00a0answered. A good plan removes risk and explains what to do. If we don\u2019t guide customers, they experience a little bit of confusion and use that confusion as an excuse to not\u00a0purchase. Even though the setup or purchase or after-purchase steps are obvious to us, they are not obvious to customers. Give them a plan and they will feel more confident. Heroes trust a guide who has a plan. People are looking for a philosophy they can embody, or a series of steps they can take to solve their problems. Customers want to know where you can take them. Unless you can take them somewhere they want to go, why would they listen? The marketing goal is that every potential customer knows where we want to take them. Define a desire for your customer, and your marketing story will have a powerful\u00a0hook. There are two kinds of plan. Both work by earning trust and offering the customer a clear path to\u00a0stability: the Agreement\u00a0Plan and the Process\u00a0Plan. Process\u00a0Plan: The minimal (3-6) steps required to buy or get benefit from the product after purchase, or a mixture of both.\u00a0E.g. Make an appointment. Allow us to create a customised plan. Let\u2019s execute the plan\u00a0together. A process plan removes confusion from the customer\u2019s journey.
When they see the plan they think \u201coh that\u2019s not difficult, I can do that\u201d and then they\u00a0purchase. A post-purchase process plan would alleviate confusion about how the customer would use the\u00a0product. Agreement\u00a0Plan: Agreement plans are about alleviating fears. It\u2019s a list of agreements you make with the customer that are designed to alleviate their fears of doing business with\u00a0you. An agreement plan can also work to highlight shared values. Give the agreement plan a good name and it can increase the perceived value of your product. \u201cthe plan\u201d, \u201cyour best night\u2019s sleep ever\u201d,\u00a0etc. Agreement plans can work in the background; they don\u2019t have to be on the landing page, though they could\u00a0be. Make an agreement plan by creating a list of all the things a customer could be fearful about when doing business with you (haggling for price, interacting with a pushy salesman, buying a defective product) and then create a promise that would nullify each\u00a0fear. A Call To\u00a0Action: So far, we\u2019ve defined a desire, identified their challenges, empathized with them, established our competency and given them a plan. Heroes only take action when challenged by an external force. They don\u2019t take action by themselves; they must be challenged. This is just how humans\u00a0are. We are the external force that guides our customers to\u00a0success. Heroes need to be challenged by external forces. Calls to action should be clear, and should be repeated over and over. Above the fold, in the center of the page. Also in the navbar. And also repeatedly as they scroll down the\u00a0page. Customers are bombarded with adverts all day every day. They are ignoring things and filtering out noise all the time. So don\u2019t be shy or subtle. Be very clear. Make it very simple. If you have confidence in your product, make confident calls to\u00a0action.
Direct\u00a0Calls to action: Buy\u00a0Now. Schedule an\u00a0Appointment. Order\u00a0Now. Call\u00a0Today. Register\u00a0Today. Repeat the same (singular, simple) call to action again and again down the\u00a0page. Transitional calls to action: Download our free PDF guide to growing your\u00a0business. Free information - advice, guides. Giving something away for\u00a0free. Testimonials. Samples. Free-trials. A good transitional CTA does three things: Changes the customer\u2019s perception of you - it establishes your expertise and\u00a0authority. Creates reciprocity - you offer them something of value before you ask for their\u00a0money. Positions you as a guide for the next\u00a0steps. Use both types of CTA (direct and transitional) in your messaging. Then customers will understand (simply and without burning calories) what you want them to do in order to solve their problem. What is at\u00a0stake? Stories live and die by a single question: \u201cWhat is at stake?\u201d If nothing is about to be gained or lost then nobody cares. If there is nothing at stake in the story then there is no story. If there is no benefit to buying the product, then why buy\u00a0it? You have to show the customer the cost of not buying the product. Avoid\u00a0Failure: The story remains interesting as long as the hero is teetering on the edge between success and failure. A hero in a story only ever has 2 motivations - to escape pain or experience something good. Life is like that too, our desire to avoid pain motivates us to make a\u00a0change. There need to be meaningful and consequential stakes in the story, otherwise it isn\u2019t interesting. Each scene needs to move the hero either closer to or further from their tragic ending. A brand needs to answer the \u201cwhat if I don\u2019t buy\u201d question, otherwise the customer can\u2019t answer the \u201cso what?\u201d question - the stakes need to be clearly, simply and concisely communicated. Probably a single stake, not many\u00a0stakes. You can do it humorously or lightly.
Don\u2019t make a big negative scary thing out of it - failure is salt to add flavor, not a main marketing ingredient. Compare the fear to the peace and stability that could be achieved. If you show the pain \u201cbefore\u201d and contrast it to peace and stability \u201cafter\u201d then you\u2019ve opened and closed a story\u00a0loop. Blog titles, email subjects and headlines can all contain elements of potential failure to convey a sense of urgency. Bring up the negative stakes a\u00a0bit. People fear losing $100 more than they desire gaining $100 - loss and pain is more motivating than reward and\u00a0peace. Recipe: Let the reader know they are vulnerable to a threat. \u201cX% of Y get\u00a0W\u201d. Let the reader know that since they are vulnerable they should take action to reduce their vulnerability. \u201cMake sure this doesn\u2019t happen to you\u201d. Give the customer a clear, concise, simple plan to reduce their vulnerability. \u201cWe offer the thing you\u00a0need\u201d. Challenge people to take the next step right now. The CTA. \u201cCall us today to arrange an appointment\u201d. Agitate a bit of fear, and then return the reader to peace and prosperity, all within 1 or 2\u00a0paragraphs. What are you helping your customers avoid? It\u2019s only a little salt to add flavor. If it\u2019s too little, customers won\u2019t know why your product is important. If it\u2019s too strong it\u2019ll scare them away. End with\u00a0Success: Humans are looking for resolution to their external, internal and philosophical problems. They can achieve this through status and transcendence (among other things). If your product or service can help people achieve this then it should be a central part of your brand\u00a0promise. People want to change their lives and be taken into a new reality. Tell them how their lives are going to improve - peace, status, confidence. People want a vision of a happy ending.
Compare these two statements: \u201cWe\u2019re going to put a man on the moon\u201d or \u201cWe would like a highly competitive and successful space\u00a0program\u201d. Use this table to show how your customers\u2019 lives will change - it will give lots of good copy for your marketing. For each of \u201cBefore your brand\u201d and \u201cAfter your brand\u201d, ask: What do they have? What are they feeling? What\u2019s an average day like? What is their status? Talk about the end vision really clearly. And use images of happy, successful, powerful people enjoying the benefits of your product. Say the benefits loudly, confidently, clearly. Talk about your end-vision for their lives once they\u2019ve benefited from your\u00a0service. Show the customer a vision of how great their life will be if they do business with\u00a0you. Ultimately the end of the story should be a list of resolutions to your customers\u2019 problems. How do they feel and how have questions been resolved? Stories usually end in one of three\u00a0ways: 1. The hero gains status or power. Offer access - get a free\u00a0coffee. Offer a premium - skip the\u00a0line. Create scarcity - write \u201climited\u201d on\u00a0it. Identity association - wear a Rolex and be associated with what Rolex stands\u00a0for. 2. The hero becomes whole by being unified with someone or something. The hero needed something they couldn\u2019t get themselves and external provision has saved\u00a0them. Reduced anxiety, or more\u00a0security. Reduced effort, or more time to reach an ambition. 3. The hero has some internal realisation (coming of age) that gives them confidence to overcome their circumstances or internal shortcomings and become \u201cwhole\u201d and wise. They can achieve inner peace and know they reached their potential. Inspiration - Chariots of Fire - you can also run really\u00a0fast. Acceptance - fashion brands doing positive body\u00a0image. Transcendence - greater purpose and meaning - people want your brand to participate in something greater. A hero needs someone else to step into their life, tell them they are different and special and better.
That someone is a guide - that\u2019s\u00a0you. Offering an aspirational identity to our customers adds a lot of value to our products and services. Realise that your customers want to transform. People are looking for a guide. Everybody wants to change into someone better. You are helping them become wiser, fitter, more equipped, accepted or\u00a0peaceful. What does the customer want to\u00a0become? What is their aspirational identity? What kind of person do they want to\u00a0be? How do they want their friends to talk about\u00a0them? Being a guide is a position of the heart, not just a marketing tactic. Lose sleep over your customers\u2019 problems instead of your business. Commit to solving their internal, external and philosophical problems, and give them a vision to aspire\u00a0to. Customers need to be told very clearly how much other customers have changed and how far the journey has taken them. Usually, the hero is deeply flawed right up until the final act. Not all elements of a story should be used\u00a0evenly. Implementation: Websites: Above the fold, use the one-liner. It\u2019s one sentence saying what\u2019s in it for the customer. What problem do you solve, what aspirational identity do you offer? It will also give customers words they can use to tell others about your\u00a0business. Big obvious calls to action, in multiple places. Don\u2019t be\u00a0timid. Customers read in a Z\u00a0shape. People don\u2019t read websites, they scan them. Repeat the important things so that they are understood by quick readers. Use very few words. The fewer words you use, the more likely they will be\u00a0read. Perhaps 10 sentences on the entire landing\u00a0page. Increase the amount of text towards the bottom. The top needs to be short, fast, positive and\u00a0obvious. Use a \u201cread more\u201d button to expand longer text, then customers have the option if they\u00a0want. Place a transitional call to action next to the main call to action. \u201cDownload a guide\u201d next to \u201cbuy now\u201d.
Put the transitional CTA in less bright\u00a0colors. Repeatedly ask people to buy - twice above the fold and then in subsequent sections. If you don\u2019t tell people what to do then they won\u2019t do\u00a0it. Include images of success - people enjoying the benefits - resolved problems, aspirations achieved, closed story loops. A sense of health and well-being. The flow of the landing page should follow the StoryBrand framework, albeit not\u00a0exactly. Corporations: Not applicable, but if you\u2019re a manager and you give your team a narrative they can fit themselves into, I guess they\u2019re going to have more purpose, confidence and\u00a0meaning. Five steps: 1. Create a one-liner. Say how you can make their lives better in an engaging\u00a0way. 2. Create a lead generator and get qualified email addresses. A guide, a free 30-minute consultation, a voucher,\u00a0etc. 3. An automated email campaign - a weekly email, 3 with nurturing advice and then 1 with a CTA. All\u00a0automated. 4. Collect and tell stories of transformation. Almost all stories are about the transformation of the hero. Tell this story and people will understand what you are\u00a0offering. 5. Create a system that generates referrals. Word of mouth is powerful, so build a system for it. 1.\u00a0One-liner: The one-liner is a statement. It could be more than 1 sentence, but it\u2019s supposed to be super short. It\u2019s the equivalent of a logline for a movie. Keep editing it until you find a version that\u00a0works. Memorize it, put it on your website, and include it in every piece of marketing collateral you\u00a0create. You could tell this to anybody and they would understand what you\u00a0do. Helps people realize they need your\u00a0service. Provoke imagination and\u00a0intrigue. It should include: A character (\u201cA busy mom\u201d, \u201cA\u00a0retiree\u201d). A problem - don\u2019t miss an opportunity to talk about a customer\u2019s challenges. Define the problem as vitally important; it opens a story gap and customers will want you to close\u00a0it. A plan - hint at it.
You can\u2019t explain it all in a\u00a0one-liner. Success - paint an image of life after a customer has bought your\u00a0service. Example answers to the question \u201cWhat do you\u00a0do?\u201d: \u201cWe provide busy moms with a short, meaningful workout they can use to stay healthy and have renewed energy\u201d vs \u201cI run a\u00a0gym\u201d. \u201cWe help retired couples who want to escape the cold avoid the hassle of a second mortgage while still enjoying the warm beautiful weather of Florida during the winter\u201d vs \u201cI got involved in real estate a few years ago and when we had our second kid we moved to Florida and\u00a0then\u2026\u201d. 2. Create a lead\u00a0generator: A guide to\u00a0download. A free webinar or online\u00a0course. Software demo or free\u00a0trial. Free\u00a0samples. Live\u00a0events. No need to reinvent the wheel - what are others\u00a0doing? You can give away quite a lot of value that is easy to consume in an email or PDF - people will consume it quickly and will probably be happy to pay for a chance to learn in a more thorough and structured way. Downloadable guides etc. should be about 3 pages in\u00a0length. Be generous, explain the \u201cwhy\u201d and give away as much of the \u201chow\u201d as\u00a0possible. 3. An automated email\u00a0campaign: Even if customers don\u2019t click the links in the email, they keep seeing your brand and they become familiar with it. When they need your services you will be the go-to brand in their mind. The relationship and association is already\u00a0built. Nurturing emails and Call-To-Action emails - 3:1 - put real value in the nurturing emails. Don\u2019t be passive in the CTA emails - you want them to buy, so tell them, repeatedly, and make it\u00a0easy. Talk about a\u00a0problem. Describe a product you offer that solves the\u00a0problem. Describe what life can look like once the customer has bought your product and solved their\u00a0problem. Call the customer to an action that leads directly to a\u00a0sale.
Similar to a nurturing email, a direct action email also describes a problem and a solution, but in the direct email the solution is a product you sell and there is a strong call to action. A lot of the content can be taken from the nurturing emails. 4. Collect and tell stories: A great testimonial gives customers the gift of going second, lowering the riskiness of\u00a0purchase. A good testimonial showcases: the value you\u00a0create, the\u00a0results, the\u00a0experience, what transformation has\u00a0occurred. Few things are more important to a good story than a hero that experiences an external, internal and philosophical transformation. This is because everyone desires to be transformed in some\u00a0way. People love businesses that help them transform in some\u00a0way. Questions to get a good testimonial: What was the problem you were having before you discovered our\u00a0product? What did the frustration feel like as you experienced that\u00a0problem? What was different about our\u00a0product? Take us to the moment when you realized our product was actually working to solve your\u00a0problem. Tell us what life looks like now that your problem is solved or being\u00a0solved. 5. Create a system to generate referrals: Identify your existing customers. Give them a reason to talk about you - create a video or a PDF that a customer can send to their friends to help them introduce your brand and explain the value you\u00a0deliver. Offer a reward - a discount, more access, extra samples,\u00a0etc."},{"title":"Training \u2192 Knowledge \u2192 Confidence \u2192\u00a0Victory","category":"snippet","url":"trainging-knowledge-confidence-victory.html","date":"14 May 2021","tags":"quote, caesar, wisdom ","body":"\u201cWithout training, they lacked knowledge. Without knowledge, they lacked confidence. Without confidence, they lacked victory.\u201d - Julius\u00a0Caesar"},{"title":"Forward\u00a0Email","category":"snippet","url":"email-forwarding.html","date":"14 May 2021","tags":"email ","body":"I know about ImprovMX, which used to be great because you could do a lot for free, but now you only get 1 domain for\u00a0free.
ForwardEmail are 3 times cheaper than ImprovMX, and I have 2 domains forwarding email. It\u2019s not particularly private, but I can send and receive from a domain, for\u00a0free."},{"title":"Nested Auto Commands for Overriding\u00a0Colorschemes","category":"snippet","url":"modifying-vim-colorschemes-correctly.html","date":"13 May 2021","tags":"vim ","body":"A snippet detailing how to use nested auto commands to apply custom modifications when a colorscheme is loaded. Perhaps this will stop me :e-ing so\u00a0frequently."},{"title":"Vim\u00a0Snippets","category":"snippet","url":"vim-snippets.html","date":"13 May 2021","tags":"vim, netrw ","body":"A useful collection of gists. A gist about netrw."},{"title":"See where Vim is setting an\u00a0option","category":"snippet","url":"where-was-a-setting-set-.html","date":"12 May 2021","tags":"vim ","body":"See where an option was set in vim using :verbose set textwidth?"},{"title":"My Life\u00a0Expectancy","category":"Non-technical/Journal","url":"my-life-expectancy.html","date":"12 May 2021","tags":"regression, statistics, life-expectancy, death ","body":"Key points: Weight doesn\u2019t matter. Be in \u201cexcellent\u201d health. Workout 3 or 4 times a week. Drink liquor (or red wine) 3 or 4 times a week (after each workout). Try to be happy, optimistic and relaxed. What are you working so hard for anyway? You need something to do, someone to love, something to hope for. Longevity Calculation: I was playing with the life expectancy calculator and was surprised to find that their regression technique gives me a 50% chance of living to 95! I\u2019d expected a result closer to 80. After printing out a 90 year calendar, my next thought was to play with the calculator to find the maximum age I could have a 50% chance of reaching.
Doing the factors below would apparently give me: 75% chance of living to 92, 50% chance of living to 101, 25% chance of living to 107. Don\u2019t workout every day: There is no additional benefit from working out more than 4 times a week. Once you\u2019re fit and working out 4 times a week it doesn\u2019t matter if you do more exercise. Being in \u201cexcellent\u201d health is better than being in \u201cvery good\u201d health though, so make the workouts count. I guess if it didn\u2019t make a difference then it wouldn\u2019t be excellent in the first place. Drink more alcohol: I was surprised that having 3-4 drinks each week increases your life expectancy. I thought that it was best to not drink any alcohol at all. I should be drinking red wine or liquor 3 or 4 times each week. Liquor increases longevity in men, but reduces longevity for women. Red wine increases longevity in women, but has no effect in men2. I guess I should drink some rum after each workout. Weight doesn\u2019t make a difference: According to their regression my expected longevity is unchanged within a weight range of 78kg - 90kg. I guess it\u2019s much more important that I\u2019m in \u201cexcellent\u201d health, working out 4 times a week and having some wine, rum, or whiskey after each workout. But it\u2019s just statistics: The calculator only asks for quantifiable or physical attributes. It doesn\u2019t consider emotional, relational or spiritual factors. There is also a more detailed calculator, calibrated for Canadian citizens. \u21a9 Alcohol consumption in later life and reaching longevity: the Netherlands Cohort Study \u21a9"},{"title":"Globbing","category":"snippet","url":"globbing.html","date":"10 May 2021","tags":"linux ","body":"????
\u2192 4\u00a0chars. * \u2192 any number of\u00a0chars. [:upper:] \u21d4 [A-Z], same for [:lower:] and [:digit:]. [:alpha:] \u21d4 [a-zA-Z]. [:alnum:] \u21d4 [a-zA-Z0-9]. ls -l [a-d] \u2192 part of a\u00a0range. ^ and $ work like in\u00a0regex. ls a*.{doc,docx} \u2192 OR. ls a*.(doc|docx) \u2192 OR"},{"title":"More VIM\u00a0Notes","category":"Technical/Developer Tools","url":"more-vim-notes.html","date":"10 May 2021","tags":"tips, vim ","body":"Quickfix\u00a0list: A quickfix list is a set of positions in one or more\u00a0files. A quickfix list is global, not local to a\u00a0buffer. A quickfix list is not the quickfix window - the window can show the list, but the list exists independently. A changelist is local to its\u00a0buffer. Registers: 0 contains the content of the last\u00a0yank. 1-9 contain the content you\u2019ve deleted or\u00a0changed. _ blackhole register - send something here and it won\u2019t change any other register. - contains any deleted or changed content smaller than 1\u00a0row. % contains the name of the current\u00a0file. In insert mode, CTRL-R = (the expression register). Type any expression; the output is inserted into the\u00a0buffer. Substitution: :&& \u2192 repeat the last substitute command with its\u00a0flags. :~ \u2192 repeat the last substitute with the same replacement but with the last used search\u00a0pattern. Command\u00a0line: q: - opens the command-line window. Good for yanking and viewing the command history\u00a0list. :UltiSnipsEdit - opens the UltiSnips file for the current buffer\u2019s filetype. See which snippets are\u00a0defined. Delete stuff without leaving insert\u00a0mode: CTRL-H - same as\u00a0backspace. CTRL-W - delete previous\u00a0word. CTRL-U - delete everything before cursor (on same\u00a0row). CTRL-T or CTRL-D - (un)indent a\u00a0row. Delete next\u00a0word - needs a mapping. Text\u00a0objects: gf - edit the file at the file path under the cursor (useful for\u00a0netrw?). gx - open the file at the file path under the cursor (useful for\u00a0netrw?). [m, ]m - move to the start or end of a\u00a0method. @: - repeat the last\u00a0command. >> will indent a line. . will repeat the operation, so >>..
would indent a line 3\u00a0times. You can use this along with a count, which will do the indentation for n lines (with the current line being the top line). 3>>.. will indent 3 lines 3 blocks to the\u00a0right. CTRL-Y - up one line, and moves the cursor if it would go off the\u00a0screen. CTRL-E - down one line, and moves the cursor if it would go off\u00a0screen. CTRL-F - down one page, with cursor at top of\u00a0screen. CTRL-B - up one page, with cursor at bottom of\u00a0screen. Sources: The Valuable Dev has a lot of great\u00a0tips. Vim for Python has some great notes on linting and code completion plugins that I\u2019ve either copied or was more or less doing\u00a0already."},{"title":"Vimscript\u00a0functions","category":"snippet","url":"create-custom-functions-in-vim.html","date":"5 May 2021","tags":"vim ","body":"Create a custom command and function to create a new file in\u00a0vim. command! -nargs=1 Ms call s:NewFile(<f-args>) function! s:NewFile(fp) echom a:fp execute \"e \" . \"~/foo/bar/\" . a:fp . \".ext\" endfunction Useful\u00a0help: :h %:h \u2192 filename modifiers. :h expand() \u2192 expand wildcards. Including a question on SO."},{"title":"Better Text\u00a0Objects","category":"snippet","url":"vim-text-objects.html","date":"5 May 2021","tags":"vim ","body":"target more types of\u00a0objects. consistent behaviour if you\u2019re not inside the thing. jump forward or\u00a0backward. look for the nth\u00a0occurrence. select white space. Github article about a\u00a0plugin."},{"title":"Delete stuff in Vim without leaving insert\u00a0mode:","category":"snippet","url":"delete-from-vim-insert-mode.html","date":"5 May 2021","tags":"vim ","body":"CTRL-H - same as\u00a0backspace. CTRL-W - delete previous\u00a0word. CTRL-U - delete everything before cursor (on same\u00a0row). CTRL-T or CTRL-D - (un)indent a\u00a0row. Delete next word (create a mapping in\u00a0vimrc)."},{"title":"Global\u00a0Aliases","category":"snippet","url":"global-aliases.html","date":"5 May 2021","tags":"alias, linux ","body":"If you want to alias a bunch of arguments for a command, use alias -g foo=\"some complicated options\" grep some
complicated options becomes: grep foo"},{"title":"Vim register for yanked\u00a0text","category":"snippet","url":"vim-yanked-text-buffer.html","date":"5 May 2021","tags":"vim, linux, text ","body":"It\u2019s annoying when you delete something and overwrite your yanked\u00a0text. Use numbered registers! \"0 to \"9. \"0 contains the most recent yank. \"1 contains the most recent deleted\u00a0text. \"0p - paste the most recent yank, even if you deleted something after yanking\u00a0it."},{"title":"ChezMoi\u00a0shortcuts","category":"snippet","url":"chezmoi-shortcuts.html","date":"5 May 2021","tags":"dotfiles, alias ","body":"Chezmoi is a great tool for managing dotfiles. This is a shortcut to update the source state based on local\u00a0changes: chezmoi status | cut -c 4- | xargs -I % -p sh -c 'chezmoi add ~/%' Github"},{"title":"Sleep","category":"snippet","url":"sleep-is-good.html","date":"4 May 2021","tags":"sleep, lifestyle ","body":"\u201cIt enhances your memory and makes you more creative. It makes you look more attractive. It keeps you slim and lowers food cravings. It protects you from cancer and dementia. It wards off colds and the flu. It lowers your risk of heart attacks and stroke, not to mention diabetes. You\u2019ll even feel happier, less depressed, and less\u00a0anxious.\u201d Why We Sleep by Dr.
Matt Walker"},{"title":"Bash Strict Mode","category":"snippet","url":"bash-strict-mode.html","date":"4 May 2021","tags":"bash, linux ","body":"How to write robust bash scripts: Bash Strict Mode"},{"title":"How to write an About Page","category":"snippet","url":"how-to-write-an-about-page.html","date":"4 May 2021","tags":"writing ","body":"An often recommended blog post by Kaleigh Moore about writing a good about page"},{"title":"Domain Name Registrars","category":"snippet","url":"domain-name-registrars.html","date":"3 May 2021","tags":"web ","body":""},{"title":"The Honest Troubleshooting Code of Conduct","category":"snippet","url":"honest-troubleshooting-code-of-conduct.html","date":"3 May 2021","body":"blog post"},{"title":"Linux Filesystem Hierarchy Standard","category":"snippet","url":"linux-etsy-dir.html","date":"2 May 2021","tags":"linux, filesystem ","body":"/etc (etsy) \u2192 \u201cetcetera\u201d or \u201ceditable text config\u201d \u2192 a place to put config files. Originally the root directory had /boot for booting, /dev for devices\u2026 One dir for each type of thing. But this put config in many places, so /etc was created. Filesystem Hierarchy Standard fhs-2.3"},{"title":"Browser Security","category":"snippet","url":"browser-security.html","date":"30 April 2021","tags":"xss, cors, http ","body":"Blog post about CSRF CORS HTTP"},{"title":"HTML Templates","category":"snippet","url":"html-templates.html","date":"30 April 2021","tags":"html, jam ","body":"cruip.com"},{"title":"Linus Torvalds","category":"snippet","url":"interview-with-linus-torvalds.html","date":"29 April 2021","tags":"linux, interview, linus ","body":"From an Interview: I don\u2019t want to claim that programming is an art, because it really is mostly just about \u2018good engineering\u2019. I\u2019m a big believer in Thomas Edison\u2019s \u2018one percent inspiration and ninety-nine percent perspiration\u2019 mantra.
It\u2019s almost all about the little details. But there is that occasional \u2018inspiration\u2019 part, that \u2018good taste\u2019 thing that is about more than just solving some problem - solving it cleanly and nicely and yes,"},{"title":"Remote Procedure Calls","category":"snippet","url":"rpc.html","date":"29 April 2021","tags":"rpc, linux ","body":"An RPC is when an executable causes a procedure (subroutine) to execute on another computer. It\u2019s coded as if it were a normal (local) subroutine call. You don\u2019t explicitly code the details for the remote interaction. You write the same code whether the subroutine is local or remote."},{"title":"An interesting blog","category":"snippet","url":"useful-blog.html","date":"29 April 2021","tags":"linux, shell, fzf, workflow, zsh, bash, builtin ","body":"Just found a really useful blog Interesting discussion about the difference between builtins Nice examples of using fzf to improve workflows."},{"title":"lsblk","category":"snippet","url":"lsblk-command.html","date":"28 April 2021","tags":"unix, cli ","body":"lsblk is a command to get info about block devices. Used when"},{"title":"Ranger File Manager","category":"snippet","url":"ranger-file-manager.html","date":"28 April 2021","tags":"ranger, unix, vim, tools ","body":"A console based file manager with vi key bindings. Install it with brew install ranger Launch it with ranger"},{"title":"Vim Regex","category":"snippet","url":"vim-regex.html","date":"28 April 2021","tags":"vim, regex ","body":"This is a great article about using regular expressions in Vim:"},{"title":"Where and when will the current Bitcoin market peak?","category":"Technical/Cryptocurrencies","url":"when-bitcoin-top.html","date":"27 April 2021","tags":"bitcoin, finance, markets ","body":"Buying Checklist Cyclic bottom when price is close-ish to the 200 week moving average Selling Checklist Top Cap \u2248 Market Cap chart MVRV > 4 chart S2F deflection > 3, but noisy chart 0.875 \u00d7
Delta Cap \u2248 Realised Cap chart. HODL waves - 45% moved in the last 6 months chart 12 Month RSI > 90 chart 3-month coin days destroyed - check glassnode, STH and LTH chart Summary S2F model suggests a peak around the beginning of 2021Q4, in the region of $300,000. The rainbow chart seems to broadly agree with S2F. If the age-adjusted 3-month coin days destroyed goes above 550,000 then get ready to sell. Willy Woo\u2019s \u201cdouble top\u201d chart suggests a peak around $400,000. RationalRoot\u2019s comparison of bull runs suggests a market top around 14 September and a maximum price around $2,000,000 [sic]. RationalRoot\u2019s comparison of 12 month RSI suggests that the market top is reached shortly after the 12 month RSI exceeds 90. This seems less reliable than the above points. Jurrien Timmer suggests a peak price around $100,000 if price shoots up unexpectedly quickly. I expect the market top to be significantly higher. $100k will be a massive psychological level. If the price does increase to this level before approximately July then the optimal sell price would therefore not be exactly $100,000. Market top is expected around September 2021 at the earliest. The behaviour of the market will change as its participants change. There hasn\u2019t been a bull run with significant institutional investors before. Sell using a cost averaging strategy. Awareness Buy when everyone is selling, sell when everyone is buying. Be brave when there is fear; if there is no fear then it\u2019s about to get very messy. If everyone is super confident that prices are definitely going to go up, something bad is about to happen. A lack of uncertainty is a big warning bell. 14 September, $300,000. Threshold Values MVRV MVRV > 3 \u2192 Local top MVRV > 4 \u2192 Macro top MVRV has historically been one of the best on-chain predictors of market tops and bottoms. The ratio of Market Value to Realised Value is calculated by dividing Bitcoin\u2019s market cap by its realised cap.
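Since MVRV is just market cap divided by realised cap, the threshold check above can be sketched in a couple of lines. The cap figures here are invented, purely for illustration; they are not real market data:

```shell
#!/bin/sh
# MVRV = market cap / realised cap
# Hypothetical example figures -- not real market data.
market_cap=1000000000000      # USD
realised_cap=350000000000     # USD

# awk handles the floating-point division that plain sh cannot.
mvrv=$(awk -v m="$market_cap" -v r="$realised_cap" 'BEGIN { printf "%.2f", m / r }')
echo "MVRV = $mvrv"
```

By the heuristic above, a result over 3 would suggest a local top and over 4 a macro top; this example prints a value well below both.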
chart Top Cap Top Cap is 35 x Average Cap Market top when Top Cap is equal to Market Cap. Delta Cap Delta cap is Average Cap subtracted from Realised Cap. When Delta Cap is almost Realised Cap, it\u2019s a market top. When Delta cap touches Average Cap, it\u2019s a market bottom. Market Top when Delta cap is within 15%-20% of Realised Cap S2F Deflection Get key ratio values If S2F deflection > 3, but it\u2019s noisy HODL Waves >45% of supply has been moved in the last 180 days (6 months) \u2192 Sell >70% of supply has been held over 180 days \u2192 Buy chart - hover the cursor over today\u2019s date and add up all the age brackets from 24hr to 3-6 months 12 month RSI > 90 14 month RSI > 95 \u2192 Sell 12 month RSI > 90 \u2192 Sell Noisy - defer to other metrics. SOPR Use 7 day average. 1.04 \u2192 Sell 0.97 \u2192 Buy Noisy - defer to other metrics. Realised Cap > NVT Cap Realised Cap should be lower than NVT Cap. Sell when Realised Cap almost exceeds NVT cap See chart below, there have been false positives. Realised Cap is lower than NVT cap during a bull market only. Noisy, could be a miss. Charts Stock to Flow Model Rainbow Model Top Cap Realised Cap, NVT Delta Cap 3 month coin days destroyed Double top Similarities to previous bull runs Version 1: Version 2: Halving model: 12 Month RSI comparison Bitcoin price history Lowest price forward model Key metrics and terms Average Cap The \u201cforever\u201d moving average of market cap. It is the cumulative sum total of daily market cap values divided by the age of the market in days. Top Cap Average Cap multiplied by 35. NVT Cap A valuation using monetary velocity. Check out CoinMetrics for more info. MVRV The ratio of Market Value to Realised Value is calculated by dividing Bitcoin\u2019s market cap by its realised cap. Realised Cap The sum of the products of each UTXO and the market price of Bitcoin when the UTXO was generated.
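The Realised Cap definition above (sum over UTXOs of amount times the BTC price when the UTXO was created) can be made concrete with a tiny sketch. The CSV here is invented sample data, not a real chain export; the columns are amount in BTC and USD price at UTXO creation:

```shell
#!/bin/sh
# Realised Cap = sum over UTXOs of (amount * price when the UTXO was created)
# utxos.csv is invented sample data: amount_btc,price_usd_at_creation
cat > utxos.csv <<'EOF'
1.5,9000
0.25,40000
3.0,300
EOF

# 1.5*9000 + 0.25*40000 + 3.0*300 = 13500 + 10000 + 900 = 24400
awk -F, '{ total += $1 * $2 } END { printf "realised cap = $%.2f\n", total }' utxos.csv
```

Note how the 3 BTC bought at $300 contributes almost nothing to realised cap, which is exactly why realised cap discounts long-dormant coins compared to market cap.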
Market Cap The price of the most recent Bitcoin transaction multiplied by the number of Bitcoin UTXO Unspent Transaction Outputs. These are kind of like unspent coins. If you have 1.5 BTC then you might have bought 2 and sold 0.5. The total value of UTXOs in your wallet will be 1.5. SOPR The Spent Output Profit Ratio is a measure of the average profit or loss on a coin. If a coin is moved when the price is higher than when it was received the SOPR increases; if a coin is moved when the price is lower than when the coin was received then SOPR decreases. It won\u2019t be accurate for individual coins but in aggregate it gives an idea of whether coins are being sold at a loss or for profit. Market participants who have owned BTC for 3 months behave differently to those that have held BTC for 3 years. A more experienced investor will likely make more measured and less rash decisions. By segregating the UTXOs according to age you can compare old coins and new coins, experienced and inexperienced investors (in aggregate) Weak hands will sell before stronger hands, and when market price decreases it\u2019s useful to know aggregate age data for the coins being sold. If coins are moving from young wallets then the selling is likely much less significant than if coins are being moved onto exchanges from old wallets. aSOPR The Adjusted Spent Output Profit Ratio is the same as SOPR but it ignores coins less than 1 hour old. If profits are taken by old coins, aSOPR trends higher. It will trend lower when older (and therefore profitable) coins remain dormant. The higher aSOPR is, the more profit has been taken off the table. When aSOPR is less than 1, spent coins are moved at an aggregate loss. URPD UTXO Realised Price Distribution - If a lot of coins have moved within a particular price band, it is likely that there is strong price support and resistance at this price. This would be truer and more reliable in a mature market.
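The SOPR definition above is an aggregate ratio: value of spent outputs at the price when spent, divided by their value at the price when created. A minimal sketch with invented spend data (columns: amount, price at creation, price at spend):

```shell
#!/bin/sh
# SOPR = sum(amount * price when spent) / sum(amount * price when created)
# spends.csv is invented sample data: amount_btc,price_at_creation,price_at_spend
cat > spends.csv <<'EOF'
2.0,10000,12000
0.5,50000,45000
EOF

# spent  = 2.0*12000 + 0.5*45000 = 46500
# created = 2.0*10000 + 0.5*50000 = 45000
awk -F, '{ spent += $1 * $3; created += $1 * $2 }
         END { printf "SOPR = %.3f\n", spent / created }' spends.csv
```

The second spend here is at a loss, but the aggregate SOPR still comes out just above 1, illustrating the point in the text that SOPR is meaningful in aggregate rather than per coin.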
Because the market for Bitcoin is expanding so rapidly and the price is so volatile, the attitudes and expectations of market participants are also much more malleable than in traditional finance. For example, what was considered a very high price 12 months ago would be considered a disaster today. RSI The Relative Strength Index is borrowed directly from traditional finance. You can calculate it over different time periods. Miner Net Position Miner Net Position shows the degree to which, on aggregate, Bitcoin miners are profiting from the coins they\u2019ve generated from mining. Miners are expected to be among the most bullish of all market participants and therefore it is notable when they start moving coins from their mining wallets into exchange wallets. Stock-to-Flow A stock to flow model is used to measure the scarcity of a commodity. It\u2019s a calculation based on the ratio of existing supply and how much is being produced. The higher the ratio, the longer it will take for supply to meet existing demand. Gold has a stock to flow ratio of 66, which means it would take 66 years at the current rate of production to produce the amount of gold currently in circulation. Silver has a S2F ratio of 74. BTC has a S2F of about 50. Background Over the past 8 years, Bitcoin has gone through phases of rapid price increase followed by periods of rapid decrease. The price has been driven by increasing market size and a decreasing rate of issuance, and has been so volatile (compared to traditional finance) that \u201c1 month in cryptocurrency markets is like 1 year in traditional markets\u201d. However volatility is decreasing and we are seeing lower highs and higher lows during each subsequent market cycle. The single biggest factor driving multi-year market cycles appears to be the decreasing rate of supply increase (the issuance rate). The last 3 halvings1 seem to have provoked the last 3 bull cycles.
We are in the third bull cycle now (April 2021) and I fully expect it to be followed by a bear cycle. As an amateur investor, I want to buy low and sell high. I\u2019d like to time the top and bottom of the market with reasonable accuracy, just like everyone else. But I\u2019m aware that my methods are less nuanced than professional traders and analysts - I have access to less data than them and I\u2019m not willing to put in as much effort as they are. I\u2019m happy to do this Pareto style - I\u2019ll give it 20% of my maximum effort and I\u2019ll be happy with 80% of an ideal result2. This is a review of what I consider to be the best sources of metrics and analysis that I\u2019ve come across. All the resources used in this article are attributed to the original author and have been made freely available on Twitter. I hope it\u2019s OK for me to repost them here; if it\u2019s not then let me know and I\u2019ll edit the post. Hopefully helpful links This article assumes some familiarity with blockchain and financial markets. Some more general articles on this site are: Bitcoin compared to Gold How to buy bitcoin Analysts These insights, metrics and charts are the work of the following people and organisations: Willy Woo Timothy Peterson PlanB Glassnode CoinMetrics Jurrien Timmer Every 210,000 blocks, the number of bitcoin awarded to the miner for successfully adding a block is halved. The last halving occurred in May 2020 and the rate of issuance halved from 12.5 BTC/block to 6.25 BTC/block \u21a9I realise this probably isn\u2019t, strictly, what Mr. Pareto was thinking when he published his research. I hope you get my intention.
\u21a9"},{"title":"All Known Locations of an Executable","category":"snippet","url":"where-command.html","date":"25 April 2021","tags":"unix, macos, cli, bash ","body":"where the Expanding phrases: kr -> kind regards Multi-line It really seems similar to what I\u2019m using UltiSnips I found this question on SO comparing abbreviations and snippets. TLDR: It\u2019s easier to add and maintain snippets than abbreviations and you have less boilerplate with snippets than abbreviations, especially in complex cases. To fix the social sciences, look to the \u201cdark ages\u201d of medicine Emotional resilience in 3 virtues of a programmer Laziness - The quality that makes you go to great effort to reduce overall energy expenditure. It makes you write labor-saving programs that other people will find useful and document what you wrote so you don\u2019t have to answer so many questions about it. Impatience - The anger you feel when the computer is being lazy. This makes you write programs that don\u2019t just react to your needs, but actually anticipate them. Or at least pretend to. Hubris - The quality that makes you write (and maintain) programs that other people won\u2019t want to say bad things about. Also, I read a quote somewhere saying the mark of a great program is having people use it in ways you didn\u2019t expect, or something like that. Is WebAssembly magic performance pixie dust? Yamauchi No.10 Family Office A beautifull Improve and Extend Your Text Objects A Vim Guide for Adept Users How to manipulate multiple quickfix and location lists What are digraphs and how to use them. Useful keystrokes in INSERT mode Useful keystrokes in VISUAL mode Vim Using shell commands in Vim. Deep dive in CORS ps(1) - see which processes are running or sleeping.
WCHAN tells you which kernel event a waiting process is awaiting"},{"title":"Learning - April 2021","category":"Technical/Developer Tools","url":"learning-april-2021.html","date":"21 April 2021","tags":"learning, youtube, ansible, ssh, vagrant, google-cloud-platform, service-accounts, iam ","body":"Table of Contents Google Cloud Platform Ansible SSH Vagrant Google Cloud Platform It seems like I\u2019m looking for some general overview of how roles are managed, viewed, compared, and inherited. How can you tell if a user\u2019s (or a service account\u2019s) roles are adequate, or too much or too little for a particular task? And what\u2019s the difference between a user having some roles, and a user using a service account that has those roles? It would also be nice to have some kind of adversarial test, that would identify how/if users or service accounts can create identities with more flexible permissions than their own. These short videos are good, but they\u2019re not a complete solution. I\u2019m not sure where to look next. Ansible Based on Jeff Geerling\u2019s book. There are 15 episodes. Jeff seems like a great guy. I\u2019m going to try to listen to one of these each day. SSH This is also a very useful article. I made notes from it in another post.
Vagrant Good for local development (Especially when Not as good for cloud providers as Terraform. No more"},{"title":"Tweets - April 2021","category":"Non-technical/Journal","url":"tweets-april-2021.html","date":"21 April 2021","tags":"twitter ","body":"Table of Contents Front-End Mental Models Agency Razors Crypto Front-End 6 websites for top landing page inspiration onepagelove.com by @robhope \u2022 lapa.ninja by \u2022 landingfolio.com by @dannypost saasframe.io by \u2022 cruip.com by @Cruip_com \u2022 saaspages.xyz by @Versoly\u2014 Jim Raptis (@d__rapti April 14, 2021 anyone interested in a fun @microacqu 90 or 180 days from start to finish to build and sell a tiny company in public? would be really great M&A practice for my Jim Bisenius April 14, 2021 Mental Models Tobi's favorite example of FIRST PRINCIPLES is a Truck driver. His truck was sat still for 8 HOURS on THANKSGIVING waiting for his cargo to be unloaded when he realized\u2026 \u201cWhy not take the WHOLE trailer off the back of my ship rather than unloading + reloading each item?\u201d George Mack May 18, 2020 LUTKE LEARNING 6 - TALENT STACK LED BY CURIOSITY > MBA He didn't have an MBA. He didn't grind 100-hour workweeks. Instead, he played video games (which led to coding) and he snowboarded (which led to an online snowboarding store). This 'Talent Stack' led to Shopify George Mack May 18, 2020 A super long thread, worth reading it all: Josh Waitzkin might be the most INTERESTING person alive. He doesn't have Twitter. And he barely uses the internet. I compiled my favorite 5 MENTAL MODELS of his below. THREAD George Mack August 8, 2020 Agency 1/ HIGH AGENCY Once you SEE it - you can never UNSEE it. Arguably the most important personality trait you can foster. I've thought about this concept every week for the last two years since I heard it discussed on @tferriss' podcast.
THREAD\u2026\u2014 George Mack November 29, 2018 Razors THREAD: 15 of the most useful razors and rules I've found. Rule of thumb that George Mack January 16, 2021 Crypto Now let\u2019s compare this to the stock-to-flow model. Below I added in the S2F model, which is the aforementioned inflation rate regressed against price. /10 Jurrien Timmer April 13, 2021 #Bitcoin is looking strong at RSI 92. Still not above RSI 95 like 2017, 2013 and 2011 bull markets. I calculated BTC price needed for RSI 95 at April close: $92K. Let's see what the Coinbase IPO will do today\ud83d\ude80 PlanB April 14, 2021"},{"title":"SSH-Notes","category":"Technical/Developer Tools","url":"ssh-notes.html","date":"21 April 2021","tags":"ssh, linux, security ","body":"Table of Contents TLDR Setup SSH-Agent to prevent Authentication Passwords and Keys Handshake Background Source TLDR Public key only on the remote server ssh-keygen -t rsa Generate a key pair and keep the private key privately on your local machine. The new keys are added to ~/.ssh/id_ and You could reuse an existing key pair but if it gets compromised you\u2019ll need to reset cat to upload or copy a public key Copy all the output from the relevant line: ssh-rsa /dev/null"},{"title":"The trouble with climbing higher is that eventually you lose sight of the ground.","category":"snippet","url":"climbing-higher.html","date":"16 April 2021","tags":"advice, thoughts, meta ","body":"."},{"title":"I Leaked Credentials Onto A Public GitHub Repo","category":"Technical/Engineering","url":"i-leaked-credentials-onto-a-public-github-repo.html","date":"15 April 2021","tags":"hack, github, service-account, keys, security ","body":"Table of Contents Don\u2019t post secrets to public repos Background The hack Remediation Questions Study Comments Don\u2019t post secrets to public repos I made this mistake a while ago, and in the interests of openness and learning from others, I\u2019d like to describe what happened.
Maybe it\u2019ll help others avoid the mistake, and maybe I\u2019ll learn something from any conversations this generates. Background Using Google Cloud Platform (GCP), I\u2019ve been doing some work across multiple compute instances. Thankfully the work wasn\u2019t business critical or on production systems. My account was isolated away from the rest of the business. As the number of servers I was working with increased, I realised I needed to begin using some tools to automate server setup. This led me to begin using Ansible, and once I\u2019d cobbled together a working playbook I pushed my Ansible project to my GitHub account\u2026 And accidentally leaked the key for an account I\u2019d been using. The hack Within a couple of minutes of pushing the repository to GitHub I: Made the Stopped tracking the keys in git and removed them from the cache git rm -r --cached . Received an email from Google saying they\u2019d found OK, close call. The secret was leaked for less than 5 minutes. On my obscure repo, I thought there was nothing to worry about.. But then I noticed some activity in the console. Compute instances were being created, I could see the list growing rapidly. Over the next few minutes 195 compute instances and disks were being created, each with a unique name in zones across the world. The format of the name was Where type was either application, backup, jenkins, gke, prod, staging, worker, www, build, redis, or runner. Maybe some others too. The number seemed to be 5 random digits. Some of the instances were ephemeral. They all had delete protection enabled. I checked the details of a few of them and noticed some scripts that included references to Monero. So I guess a Monero mining bot was being set up.
The logs showed that GKE and networking resources had also been requested, but the account which the stolen credentials belonged to didn\u2019t have the necessary permissions. Our project also maxed out its quota of compute instances in multiple regions and zones. Remediation I deleted the account that had been leaked, and began quantifying the damage. I wanted to know exactly what permissions the key had, which resources could be created, and could the leaked account be used to create other accounts? No, it can\u2019t. After looking around and becoming confident that it was only 195 compute instances with disks and delete protection that had been created, in regions and zones across the globe, I began to remove them. No other resources had been created. It took me 10 minutes and some googling to create the script: Get all the compute instances and dump them into a file. I expected to run a script that iterated through the file line by line, setting variables based on the content of the current line: gcloud compute instances list --format zone)' > names.txt In Vim, find the rows that contain the instances that I don\u2019t want to delete, and remove these from the file. There are a handful of compute instances I want to keep, and 195 that I want to remove. :v/node- shows any rows that don\u2019t match node-. Loop through the file and for each row, which contains the instance name and its zone, Remove the delete protection Delete the instance while IFS=, read -r name zone do gcloud compute instances update $name --zone $zone \\ && gcloud compute instances delete $name --zone $zone --quiet done < names.txt The --quiet flag is necessary because otherwise gcloud will ask me to confirm that I want to delete the instances. Questions I\u2019m surprised by the speed with which the attacker found the leaked credentials. The repo did not belong to the client\u2019s account but my own, and I assume that my account is obscure enough to not be on any interesting lists.
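The cleanup steps above can be sketched as one script. Note this is a reconstruction, not the exact commands used: the `--format` projection is an assumption (the original command was truncated), and the `--no-deletion-protection` flag is my guess at how the delete protection was removed, since the `update` step otherwise does nothing:

```shell
#!/bin/sh
# Dump instance names and zones, one "name,zone" pair per line.
# The csv projection here is an assumption; the original flag was truncated.
gcloud compute instances list --format 'csv[no-heading](name,zone)' > names.txt

# (Manually prune names.txt to keep wanted instances, e.g. with :v/node- in Vim.)

# For each remaining instance: drop delete protection, then delete it.
while IFS=, read -r name zone; do
  gcloud compute instances update "$name" --zone "$zone" --no-deletion-protection \
    && gcloud compute instances delete "$name" --zone "$zone" --quiet
done < names.txt
```

Quoting `"$name"` and `"$zone"` guards against any stray whitespace in the dump; `--quiet` suppresses the per-instance confirmation prompt as described above.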
If my account is being scanned every few minutes, presumably all accounts are being scanned. How many resources are required to do that? I guess if one of these attacks works you can use the stolen compute to scan more repositories for more leaked credentials. It\u2019s easy to imagine scenarios where large corporations that are already running complicated cloud infrastructure deployments wouldn\u2019t notice a few (200?) unauthorized compute instances. Study Service accounts on Google SSH crash course Vagrant crash course IFS= Comments There was some useful discussion about this article on Lobsters."},{"title":"Broot","category":"snippet","url":"broot.html","date":"13 April 2021","tags":"broot, macos, cli ","body":"Broot is a tool that shows the contents of a directory on one screen, even if it\u2019s got lots of files"},{"title":"Adding Keyboard Navigation","category":"snippet","url":"adding-keyboard-navigation.html","date":"12 April 2021","tags":"blog, jam, jquery ","body":"I added keyboard navigation to my blog and it works really well. Now I find myself trying to use the same shortcuts on other sites"},{"title":"Ansible","category":"Technical/Developer Tools","url":"ansible.html","date":"12 April 2021","tags":"ansible, servers, ssh, automation ","body":"Background I\u2019ve been spending a lot of time lately working on nodes for various blockchain projects (Polkadot, Cardano, Tron, Binance Chain, Ethereum, \u2026). The rosetta api spec is super interesting, but like most things in crypto the documentation is sometimes wrong or incomplete and there are bugs. Each of the nodes runs on a separate server, and we typically have one node for mainnet and another for testnet. I\u2019m working across multiple servers, doing difficult stuff, and I want it to be as easy as possible. I need to reduce friction and Accessing the servers is easy - I use Tmux with the continuum and resurrect plugins and maintain different sessions for each type of server.
This makes accessing multiple servers during the same work day really simple and effortless. But working on the servers is still awkward. On my dev machine I have zsh with syntax highlighting, command completion and various tools, like z to make navigation super easy. I also have a lot of aliases defined. E.g. .. \u2192 cd ... Working on a remote server should be as convenient and familiar as working on my local machine, so I want to find a way to configure a server the same way as my laptop, and I want to do it automatically, so that it can be done many times, with no extra effort. Ansible Ansible seems to be the right tool: It\u2019s free It\u2019s got all the features and capabilities you\u2019re going to need It\u2019s agentless - you don\u2019t need to install anything on the machine you want to control - you can use Ansible with anything that you can ssh into. I used the following resources to get started: This useful video gave me some orientation and helped me figure out what I was aiming for and how to get started. Before watching it, I didn\u2019t know \u201cwhich way was up\u201d. This blog post showed me how to create an inventory using the gcp_compute plugin. I spent a lot of time being unnecessarily confused about service accounts. I guess until you have 1 success at understanding something you don\u2019t know if you\u2019ve misunderstood by a little or a lot. Once you have an inventory of servers that you want to connect to, you still need to specify (and prepare for) how you will connect to them. I\u2019d hoped that the gcp_compute plugin would do some heavy lifting for me in this step, but it seems not. It can do lots of useful stuff like creating instances and specifying disk space and networks, but it won\u2019t really help you ssh into an instance. No matter though. This blog post turned out to be just what I needed. I found it at the beginning of my search when I was trying to create an inventory, and discarded it as almost useful.
Turns out that OS Login is the best way to ssh into a GCE instance and once you\u2019ve got your inventory taken care of, this blog post really helped. When I was installing python modules, I had some errors about pyenv shims being incorrect. The scripts were looking for versions that weren\u2019t present. Running pyenv rehash fixed it. Kind of magically, but annoyingly. Setting up a service account and giving it the correct permissions took more time and was more confusing than anything to do with Ansible. I found this blog post about setting up vim for yml files. The preferred way to install ansible on Mac is using pip. When you use OS Login the username you have when you ssh into the compute instance will change. This SO question explains why. Commands gcloud auth list ansible-config view|list ansible-inventory -i --graph ansible -i all -m ping"},{"title":"Github Actions","category":"snippet","url":"github-actions-blog.html","date":"10 April 2021","tags":"github, blog ","body":"I should see if I can use GitHub actions to generate html from markdown and run some shell and"},{"title":"Socially Acceptable Mistakes","category":"snippet","url":"socially-acceptable-mistakes.html","date":"10 April 2021","tags":"meta, thinking, advice ","body":"It\u2019s more socially acceptable to make mistakes and errors of omission"},{"title":"`du` is a tool for showing disk usage.","category":"snippet","url":"du-command.html","date":"7 April 2021","tags":"cli, unix, macos ","body":"There is a similar tool, with a list of other similar tools here"},{"title":"Safe Bash Scripting","category":"snippet","url":"safe-bash-scripting.html","date":"6 April 2021","tags":"bash ","body":"An example of a safe, good, robust bash file skeleton"},{"title":"Running a Binary in Debian","category":"snippet","url":"running-a-binary-in-debian.html","date":"5 April 2021","tags":"debian, binary, path ","body":"I was running a binary in Debian that was complaining about an environment variable not
existing. I moved the binary into a $PATH directory and logged in as a sudo user. Why did this solve the problem?"},{"title":"Over-Engineering this blog","category":"Technical/Web","url":"over-engineering-this-blog.html","date":"5 April 2021","tags":"blog, javascript, self-reflection ","body":"Over the last few weeks I\u2019ve spent an unreasonable amount of time and energy making unnecessary improvements to this blog. Some of these: Adding keyboard shortcuts (type ? to find out which) Implementing and then optimizing client side fuzzy search Using src-set to serve responsive images Lazy loading images to make this site load faster Compressing page files using brotli and also gzip (Precompression) Trying (and ultimately failing) to avoid a \u201cwhite flash\u201d when dark mode is chosen and a new page loads (Github discussion) I\u2019m not really sure why I did it. It makes almost no difference to anyone but me. It felt a I like tinkering, and it\u2019s nice to build something that will continue to work with no maintenance. I tell myself that over the next few years I will gain the benefits of these features even when I\u2019ve forgotten I It\u2019s taught me a lot of JavaScript which is a great language to be familiar with - it\u2019s everywhere I would warmly encourage someone younger than myself to pursue interests for the sake of curiosity and fun. And there is a very high chance that even if no-one uses the shortcuts except me, my new javascript skills will come in useful. But even if they do I\u2019m not sure it\u2019s a good enough reason - things should be built when they solve a present problem, not for what-ifs and maybes. YAGNI. I wouldn\u2019t let myself do this in a professional capacity. There is a tension between being curious and I\u2019m not really sure that I need to justify myself.
It\u2019s a hobby, I wanted to do it, I enjoy tinkering with web technologies and learning new things. But also, I lost sleep over this - I stayed up too late, and let it put pressure on other things. I know that being curious, and making room to play with interesting things, has been one of the most useful approaches to personal development and up-skilling myself. But there must be a limit.. There is a tension between wasting my time and taking a risk, and it will take a few years before I know for sure if these efforts were worthwhile or not. If it\u2019s not fun, don\u2019t do it. Successful business owners seem to be very good at leaving things alone once they\u2019re \u201cgood enough\u201d, and not being In fact, I think that being a perfectionist is antithetical to being an entrepreneur. I am not a perfectionist, I\u2019m just really curious and have a big appetite for learning. But this \u201cappetite for learning\u201d stops me from focussing. I let myself become distracted by adding new features to this blog, when instead I should zoom out a bit and think about working towards a more substantial and meaningful goal, to the exclusion of more minor goals. I think that good entrepreneurs are very focussed, to a fault. I am not that focussed. I am too distracted by life. It\u2019s a balancing act, there is a tension between being emotionally and physically present with my family and friends, and ignoring as many things as possible so that I can focus on doing something meaningful that is"},{"title":"Fuse Search","category":"Technical/Web","url":"fuse-search.html","date":"5 April 2021","tags":"fuse, search, web ","body":"Adding search made the site feel faster and more accessible. I\u2019ve reimplemented search on this site using fuse.js instead of tinySearch. You can read about how I implemented tinysearch here. When I first implemented search I was surprised how much faster and more accessible the site began to feel.
I could quickly access any content by typing a few words, I didn't need to scroll or follow a link.1 This means I can find content without having to think about how to get there - I don't need to break my flow or concentration. It might sound like a trivially small consideration, but lowering friction or cognitive load in small ways can make the difference between using or not using something when you're already working hard or concentrating on something else. For example, if I want to look up my notes about using the nohup command, I can quickly go to the site, type / (the keyboard shortcut for search), type “nohup” and hit enter. This is all muscle-memory level impulses. I don't need to think about the content, think about its category or when I posted it, then scroll down and scan a list, or use a mouse to click on intermediate links. Win. Working at the speed of thought rather than the speed of input is a big deal. Why I switched from tinySearch to Fuse.js Before implementing fuse.js, this site had a search feature powered by TinySearch. I wouldn't have had enough knowledge to implement fuse.js if I hadn't already learnt some JavaScript whilst setting up tinySearch. TinySearch had an example for Pelican Blogs, and a simple and clear readme. By using tinySearch first I saw an example of how to build the JSON array that becomes the search index, and how to implement the JavaScript that's required for client-side search. Also, in the course of developing this blog I've become much more proficient and comfortable with JavaScript (and jQuery) in general. Fuse.js is really quite simple to set up once you're familiar with JavaScript. It's much more flexible than tinySearch: you can choose search weights for different fields, accuracy thresholds and some parameters for the fuzzy search algorithm. The general approach is to instantiate an instance of Fuse by calling Fuse with a JSON array for it to parse, along with some options.
You then give the instance a string and get back an array of results which you can do whatever you want with. The accuracy of the search results is higher with fuse.js and the speed is still acceptable. I did have to do some optimization of the search index that Fuse generates, though. Optimizing the search index The “normal” search index that Fuse uses to return results is a JSON array of all the content of all the articles that you want to be able to search. You can generate it using a jinja template or any other way you want. (There simply needs to be a JSON array that the browser downloads and does a fuzzy search on). This gave me a file that was about 4MB. Once I asked Fuse to search the complete text of each article (not just the default first 600 chars, iirc) then speed really suffered. I optimized the index file in the following three ways: Removed any non-words. Some of my articles are jupyter notebooks that have been converted to articles (the plugin to do this is one of the reasons why I began using Pelican). When the index is built, lots of code and html gets included, which isn't helpful. Any “words” that are more than 20 chars I just delete. Removed the 150 most common words. Any word that is in many articles is not useful for distinguishing between different articles, so they can be deleted from the index. They don't add any meaning. I wrote a short pipeline of shell commands using tr, sort, uniq to generate a file with a list of the most common words. I then wrote a python script to update the original search index by removing all the common words. Shortened any long words by only keeping the first 12 characters. If a word was 15 characters long, I simply removed the last 3 chars. I figured this would work fine because matching the first 12 characters would already be quite unique and give a good result. Doing these 3 optimizations reduced the file size by about 90%.
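The post describes the three index optimizations but doesn't show the script. A minimal Python sketch of the idea follows; the function name, the tiny stand-in common-words set, and everything except the 20-char and 12-char thresholds are assumptions, not the author's actual code:

```python
# A tiny stand-in; the post used the ~150 most common words, generated with tr/sort/uniq.
COMMON_WORDS = {"the", "and", "that", "this", "with"}

def optimize_text(text, max_word_len=20, keep_chars=12, common_words=COMMON_WORDS):
    """Shrink one article body for the client-side search index."""
    kept = []
    for word in text.split():
        if len(word) > max_word_len:
            continue  # drop code/html "non-words" longer than 20 chars
        if word.lower() in common_words:
            continue  # drop words that don't distinguish articles
        kept.append(word[:keep_chars])  # keep only the first 12 characters
    return " ".join(kept)
```

Running this over every `body` field in the index JSON is what produces the truncated words visible throughout this very file ("implementation" becomes "implementati", and so on).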
Compressing the JSON using gzip or brotli makes the files even smaller, and now the amount of data transferred to the client seems reasonably small. (This is a static site, and therefore search has to happen client side.) The browser would still begin to lag as the search string length increased. It takes more time to search for a 10 character string than for a 5 character string, and initially fuse was doing a search every time a character was typed. I wanted the site to feel as fast as possible and thought that if search was paused whilst typing and occurred a short time after the last key was pressed this would be an improvement. I added a short delay of 200ms to the function call, and typing during the delay time resets the timer. This reduced the lag and made the search tool feel responsive. I learnt that this is called “debouncing”. There was some further complexity when I wanted to debounce characters used for searching, but not the navigation or keyboard shortcuts. Getting the debounce function to only run on some key presses was surprisingly complex. It taught me a lot of JavaScript though, and it's satisfying to have made a useful user interface. It also immediately gave me the idea to add keyboard shortcuts. Type ? to see what happened ↩"},{"title":"Creating\u00a0Slowly","category":"snippet","url":"creating-slowly.html","date":"1 April 2021","tags":"meta, thinking, advice ","body":"As a hacker, or creator, or whatever the best label is, I always want to create something (usually code) and have it finished. But a strange creativity and productivity boost comes from dabbling, dipping in and out. I think that if the technical challenges aren't too hard, then the main criteria for success is creativity. Creativity needs time away from the project, and sleep, to bubble up and let ideas grow. Ultimately, the most successful path is usually the most interesting, because success has more consequences than failure.
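The debouncing described in the Fuse Search article above is implemented on the site in JavaScript, which the post doesn't show. The same idea can be sketched in Python with threading.Timer, as an illustration of the technique rather than the site's actual code:

```python
import threading

def debounce(wait_seconds):
    """Delay calls to the wrapped function until `wait_seconds` have
    passed with no new calls; each new call resets the timer."""
    def decorator(fn):
        timer = None
        def wrapper(*args, **kwargs):
            nonlocal timer
            if timer is not None:
                timer.cancel()  # typing during the delay resets the timer
            timer = threading.Timer(wait_seconds, fn, args, kwargs)
            timer.start()
        return wrapper
    return decorator

results = []

@debounce(0.2)  # 200 ms, the delay the post settles on
def search(query):
    results.append(query)
```

Calling search("n"), search("no"), search("nohup") in quick succession runs the search only once, for "nohup", 0.2 s after the last call, which is exactly why the browser stops lagging while you type.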
“Interesting” requires elements of novelty and surprise, and without creativity these elements can't flourish. Dabbling results in more creativity than 6+ hours of strenuous work, and is more likely to give you…"},{"title":"Arrow syntax in\u00a0bash","category":"snippet","url":"arrow-syntax-in-bash.html","date":"1 April 2021","tags":"bash, syntax ","body":"bar << foo - bar will stop reading input when it reaches “foo”. bar <<< \"foo\" - foo is all the input. bar won't… bar < <(foo) - process substitution. Kind of like piping in the output of foo. Stack Overflow"},{"title":"Pretty print JSON","category":"snippet","url":"pretty-print-json-in-typescript.html","date":"31 March 2021","tags":"typescript, json, syntax ","body":"null, 2)}`);"},{"title":"Vim: GoTo Tag\u00a0Definition","category":"Technical/Developer Tools","url":"vim-notes-goto-tag-definition.html","date":"31 March 2021","body":"Update (2021-03-31): Just use neovim.coc instead of YouCompleteMe or Syntastic. It's faster, easier to setup, and works intuitively. ALE is still wonderful and useful, though there's a lot of overlap - coc can lint as well. Jump Lists and Change Lists If you're going to be jumping around to where things are defined, you will need to know how to jump back again. It seems there are two lists you need to be aware of, the jump list1 and the change list2. Jump List A list of locations that the cursor has jumped to. move up the jump list move down the jump list Jumping to a definition or a search result Change List g; and g, → move up and down the change list A list of locations where a change was made. '. → go to the location of your last edit (. is a mark). '' → go back to where you were before your last jump Original Post: There are multiple ways of doing anything with vim, including going to where a function or object is defined, and I usually need to do something at least 3 times before I can do it without breaking my focus or train of thought.
My memory is hazy but I remember spending a 1/2 day looking into this and considering which solution I wanted to commit to.3 My options seemed to be between YouCompleteMe and ALE. [Update!4] I can't remember everything I read and tried, but I trust my conclusions. Looking in my .vimrc I see that I have x mapped to :YcmCompleter GoTo and it works just fine, even when a module is imported from somewhere outside the current project. The tool was working and ready to use, I just hadn't internalized it yet. Commands to remember: x - GoTo definition - YCM's best guess at an “intelligent” goto command, whether it's a declaration or a definition - Toggle tagbar :help jumplist ↩ :help changelist ↩ The more powerful the tool, the more worthwhile it is to take a closer look at what it can and can't do. ↩ YCM and ALE work fine for goto definition and linting, but they don't give me satisfactory… looks like it might offer some improvements ↩"},{"title":"Useful\u00a0Business","category":"snippet","url":"useful-business.html","date":"30 March 2021","tags":"entrepreneur, saas ","body":"looks like a really…"},{"title":"Frantic\u00a0Distraction","category":"Snippet","url":"frantic-distraction.html","date":"30 March 2021","tags":"meta, thinking ","body":"Frantic distraction via productivity is exhausting and useful."},{"title":"Rearrange splits in\u00a0Vim","category":"snippet","url":"vim-split-rearranging.html","date":"29 March 2021","tags":"vim ","body":"x - swap buffers, but keep the arrangement the same. H - make the current split cover the left of the screen. J, K, L cover the bottom, top, right of the screen. blog post / stack overflow"},{"title":"Pelican Plugin\u00a0Guide","category":"snippet","url":"pelican-plugin-guide.html","date":"29 March 2021","tags":"pelican, plugin, guide ","body":"A guide about writing plugins for Pelican. Thanks @geographe"},{"title":"Read and Write the Same File 
in\u00a0Bash","category":"snippet","url":"read-and-write-same-file.html","date":"24 March 2021","tags":"shell, pipe, syntax, bash ","body":"I tried to read and write the same file in a pipeline, and got caught out by a race condition (why is the file empty?!). Do this instead: some_script < file > smscrpt.$$ \\ && mv smscrpt.$$ file || rm smscrpt.$$ || removes the temporary file if it errors. $$ is the process ID and ensures that you always have a unique temporary file name."},{"title":"JSON\u00a0tools","category":"snippet","url":"json-tools.html","date":"24 March 2021","tags":"json ","body":"jj - A stream editor jq - A json processor python -m json.tool I like jq for pretty printing JSON output, jj for making JSON pretty or condensed. This was really useful when optimizing the search index for this blog."},{"title":"Docker\u00a0Commands","category":"snippet","url":"docker-commands.html","date":"17 March 2021","tags":"docker ","body":"docker run -d ... docker logs -f docker run -it ... docker run -itd docker container attach -> detach from container interactively stack overflow"},{"title":"Python\u00a0Notes","category":"Technical/Developer Tools","url":"python-notes-2.html","date":"17 March 2021","tags":"python, learning notes ","body":"__call__() In Python, every time you call a function or method such as my_function(), the interpreter will replace the ( with .__call__(. >>> def f(x): ...     return x+1 >>> f.__call__(2) 3 class Prefixer: def __init__(self, prefix): self.prefix = prefix def __call__(self, message): return self.prefix + message Then use prefixer like this: >>> simonsays = Prefixer(\"Simon says: \") >>> simonsays(\"jump up high!\") 'Simon says: jump up high!' Every time you call a function or method, you're really just calling a built in __call__ method. There should be one, and preferably only one, obvious way to do something. It's in the “Zen of Python”, which is a set of guidelines that help make design decisions. 
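The Prefixer example above was mangled when the search index truncated long words; reconstructed from the surrounding text, it reads:

```python
class Prefixer:
    def __init__(self, prefix):
        self.prefix = prefix

    def __call__(self, message):
        # calling the instance invokes __call__, so it behaves like a function
        return self.prefix + message

simonsays = Prefixer("Simon says: ")
print(simonsays("jump up high!"))  # Simon says: jump up high!
```

The instance itself is callable, which is the point of the note: `simonsays(...)` is sugar for `simonsays.__call__(...)`.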
It's a choice that Python made, and other languages do… There are different levels to languages and this applies more to the idiom level than the design pattern level. It applies even less at the architecture level where there can be several equally good ways of organizing business logic and… Perl has the “TMTOWTDI” (pronounced “Tim Toady”) principle - “There's More Than One Way To Do It”. Perl's philosophy is to give users more than one way to do something."},{"title":"Adding\u00a0Search","category":"Technical/Web","url":"adding-search.html","date":"12 March 2021","tags":"blog, search, tinysearch, web ","body":"I've added search to this blog. Results are generated as you type. Try it by typing / or cmd-k. If you look on the Pelican plugins index you'll see that Tipue search is the only search tool with a ready-made Pelican plugin, but unfortunately the project seems to have died and the project's website is now… But searching a static site must be quite a common need and googling for alternatives gave me a few choices. Lunr.js seems to be the most popular, but it also seemed fairly complicated and like it was probably more than I needed. I went with Tiny Search because it seemed to do what I needed and was easy to setup. There's even an example for Pelican blogs. One hurdle to success was minimising the false positives. The default settings seem to prioritise keeping the size of the index small (tiny) over giving a good user experience. Maybe it's because the amount of text on my site is significantly less, or more, than the typical use case. Either way, after checking the project's issues on Github I found an issue that matched my problem perfectly. The solution is to increase the tiny_magic variable at build time. According to the Readme, this requires using a container and building the index using docker run.... Unfortunately, the Dockerfile wouldn't complete without errors. 
Checking the issues again and adding to the discussion resulted in an alternative Dockerfile being suggested, which works. Woohoo! I could then build the search index with a massive tiny_magic value (204… Then something weird happened. I write in Vim and I use fzf to find and open files. I realised that fzf had stopped working. After some investigation I realised it was only not working in the blog project, and that fzf.vim calls the fzf CLI tool, which in turn calls the ripgrep tool. The underlying issue was that ripgrep wasn't working, and after a few hours (sob) of debugging, I found out that one of the things that makes rg special is that it ignores stuff in your .gitignore file. Sneakily, and without me noticing, the Docker image for constructing the tinysearch files had created a .gitignore file with a single entry. The entry was *, which selects everything. So rg was ignoring everything and giving no results. Which meant I couldn't find and open files. I still don't know how (or which part of) the Dockerfile does this, so I've created a file which contains the correct content, and after I generate a new search index I replace the new traitorous .gitignore with the contents of… I'll come back to it later when/if I have a better understanding of Dockerfile syntax, or Rust. Adding search to the site made the content feel a lot closer and more accessible. Once it was working I immediately wanted to use some keyboard shortcuts to open the search box and select results. Kind of like … does it. It feels really fast and precise. Googling for some jquery packages, and also some vanilla JavaScript, showed me enough to get things working. You can hit / or ctrl-k or cmd-k and bring up a search box that populates results as you type! Only whole words are matched, unfortunately, but it's still a super useful feature. 
The search index includes article content as well as article titles and categories. I'd like to tweak a few of the keyboard shortcut behaviours and add the contents of various pages (which aren't articles) to the search index. Update I've reimplemented search using fuse.js. You can read about it here"},{"title":"Pipe a Script File into\u00a0Bash","category":"snippet","url":"pipe-a-script-files-into-bash.html","date":"11 March 2021","tags":"bash, syntax, shell ","body":"Probably it's one you just curl-ed curl -sSfL | sh -s"},{"title":"Split Long\u00a0Strings","category":"snippet","url":"split-long-output-onto-multiple-lines.html","date":"10 March 2021","tags":"bash, linux ","body":"Split long strings (or command outputs) onto multiple lines. Find and replace a particular char (maybe :) with a \\n. ... | tr ':' '\\n' ... | sed 's/:/\\n/g'"},{"title":"ripgrep\u00a0Regret","category":"snippet","url":"ripgrep-regret.html","date":"10 March 2021","tags":"ripgrep, fail ","body":"Without noticing, create a .gitignore file with a single * in it. Spend a day trying to understand why ripgrep has stopped working for only 1 project. 😭😭😭"},{"title":"Teaching Kids About\u00a0Money","category":"snippet","url":"teaching-kids-about-money.html","date":"5 March 2021","tags":"parenting, kids, money, teaching ","body":"Teaching my kids about money and work is having an effect. 
Yesterday, my daughter made a painting for me and asked me to buy it using pretend money."},{"title":"Stop Prepending sudo to Docker\u00a0Commands","category":"snippet","url":"stop-prepending-sudo-to-docker-commands.html","date":"5 March 2021","tags":"sudo, docker, linux ","body":"sudo groupadd docker -> make the group sudo gpasswd -a $USER docker -> add $USER to the docker group newgrp docker -> activate the changes"},{"title":"`cat` and a new\u00a0line","category":"snippet","url":"cat-and-a-new-line.html","date":"5 March 2021","tags":"cat, linux, bash, shell ","body":"If you're cat-ing a file and the bash prompt doesn't start on a new line (cos the file you displayed using cat doesn't end with a new line char) the following will fix it: cat ; echo"},{"title":"Cardano: Generating\u00a0Addresses","category":"Technical/Cryptocurrencies","url":"cardano-generating-addresses.html","date":"5 March 2021","body":"If many different customers are to deposit or send ADA (the unit of currency on the Cardano blockchain) to a Cardano node, it will be necessary to determine which customer is responsible for each transaction so that the correct customer account can be updated. As with many things involving blockchains, this initially seemed like a simple requirement but involved several hours of work. Cardano wallets are generated using a parameter called… The default value is 20, and is the number of unused addresses that the node will generate and return to a client using the REST API. If one of the addresses is used, the node will automatically generate another so that there are always 20. This is probably very convenient for personal use. If I want someone to send me some funds, I can make a simple api call using cURL and get a fresh address. But if you are running a service, whether it's e-commerce or a financial service, it's not really good enough. 
Some advice on the forums says to generate a wallet with a very large value such as 10,000 and just generate a new wallet when you run out of fresh addresses, but it still feels like a compromise. But let's explain our situation in more detail first. If a customer wants to send us some ADA, we want to give them a fresh address that's never been used before and that only they have. Then we know that any funds that arrive to that address are from a particular customer. However we don't know if the customer will actually use the address and transfer any funds. The address might remain unused or it might not. Nevertheless, that address is now reserved for them, and no one else can use it. In this way, we might need to generate and maintain a list of thousands of addresses that are never used. Using … for this seems like a bad solution. Fortunately, … has the answer, albeit in a fairly convoluted and obscured form. If you have the mnemonic that was used to generate a wallet originally you can generate 2^31 unique addresses like so: Clone the repo and build the docker image: git clone docker build -t . Get the mnemonic and generate a file containing a list of space separated words on one row. Run the following: export increment= && cat mnemonic.txt | docker run --rm -i key Shelley | docker run --rm -i key child | docker run --rm -i key public | docker run --rm -i address payment --network-tag testnet > payment.addr && cat payment.addr; echo"},{"title":"creating users with sudo\u00a0permissions","category":"snippet","url":"creating-users-with-sudo-permissions.html","date":"4 March 2021","tags":"sudo, linux, user, admin ","body":"adduser -m usermod -aG sudo CentOS: adduser -m passwd usermod -aG wheel (wheel is a usergroup with…"},{"title":"Two Years Of\u00a0Vim","category":"Technical/Developer Tools","url":"two-years-of-vim.html","date":"4 March 2021","body":"I've been feeling very comfortable with my Vim + Tmux setup recently. 
Navigating around shells and files isn't taking much mental effort. It's taken about 2 years of working full time with vim to get to the stage where the commands are so… I pepper text files outside of vim (email, notes, etc) with vim keys accidentally - j k x etc. I can't remember what the command is to do something if I'm not actually doing it. When I need to do an action, I do it from muscle memory and I only pay attention to the underlying key press if something goes wrong. This is noticeable when trying to find an unbound key combination for some new action, or when reading an article about vim and thinking “that's new” when actually I've been doing it… A pleasant surprise has been that it doesn't take much effort to rebind a single command and retrain myself to use it. This is presumably because the mental effort for all the other commands has become negligible. In the early days, retraining a key combination took a lot more effort because I was already making an effort to get used to doing things in Vim. I can work even when my vision is blurry (and my speech slurred and my head heavy) because I can use text objects and navigation commands to get to where I know text is. I'm not saying I should work when I'm that tired, but I can, if I'm already familiar with…"},{"title":"Disk Full and Disk Usage\u00a0Commands","category":"snippet","url":"disk-full-and-usage-commands.html","date":"3 March 2021","tags":"linux, du, df, shell, cli ","body":"df -h Show disk space du -hs . See how big the current dir is"},{"title":"Git LFS","category":"snippet","url":"cloning-git-repos-using-lfs.html","date":"3 March 2021","tags":"git, git-lfs ","body":"Cloning large repos, or repos with large files in them, doesn't work with git clone ... you need to use git lfs clone ... 
So why is git lfs clone deprecated? What's…"},{"title":"Binance-Chain: Running a\u00a0node","category":"Technical/Cryptocurrencies","url":"binance-node-api.html","date":"3 March 2021","body":"I've been setting up a binance-chain node. Unlike Polkadot or Cardano, I'm not going to run it from a container until it's working reliably. The Binance docs show a couple of ways to install a node. I used the install.sh script and went with default values as much as possible. Installation My first attempts to sync a full node used the install.sh script, but the node wouldn't sync completely; it would get stuck. I setup a new VM and did a manual install (“Option Two”) and so far the node has been syncing without any issues. You need to download the genesis file separately in this case. Also, be sure to download the node-binary repo using git lfs and not just git. It will look like it worked but bnbchaind won't have completely downloaded unless you use lfs. It took me a while to realise that the documentation assumes that you have an environment variable called BNCHOME. You can either create it using export (like you would for any environment variable) or replace the environment variable in the start node command with the file path: nohup bnbchaind start --home BNCHOME & Note: I'm not sure if bnbchaind needs the environment variable to be set or not. It doesn't give errors if it isn't set, but I seem to be having more success when BNCHOME is defined. Syncing the node There are three ways to sync a node. Fast-sync isn't the fastest way to sync your node, hot-sync is. Using install.sh should put the correct default values in the file, but I needed to adjust ping_interval and pong_timeout to the recommended values. Surprises The documentation assumes you have familiarity with running tasks in the background of a terminal session, and that you're familiar with nohup. 
I wasn't - I'd even forgotten what the & symbol does1 so I did some research and wrote some notes. It starts a process in the background. You can move it to the foreground with fg or see a list of running jobs using jobs. You can move a running job to the background (like a vim session) using ctrl-z ↩"},{"title":"nohup","category":"snippet","url":"nohup.html","date":"2 March 2021","tags":"linux, cli ","body":"Use nohup to keep a curl process running even when the terminal (tty?) session autocloses at 3am."},{"title":"Shell\u00a0Comparisons","category":"snippet","url":"shell-comparisons.html","date":"2 March 2021","tags":"zsh, bash, bsh, linux, cli ","body":"You can group shells into groups: ksh - korn shell and zshell. sh - bourne shell and bash (the bourne again shell). Because zsh isn't a superset of bash. bash is a superset of the bourne shell."},{"title":"nohup and Background\u00a0Processes","category":"Technical/Developer Tools","url":"nohup-and-background-processes.html","date":"2 March 2021","body":"Stop stuff stopping. If you run a command in a terminal session and the terminal session is disconnected then the processes running in it will also be terminated. I discovered this when I was trying to download a ~500GB database overnight. I logged in the next morning expecting to see a completed download, but found I only had half the file. Use nohup to ignore HUP signals One solution to this seems to be to use nohup, a command that ignores the HUP signal. It stops your programme from stopping if the terminal session it's running in is stopped. By convention the HUP signal is the method used by a terminal to warn dependent processes that it is about to logout. You probably want to run nohup in the background. You might want to prevent it from creating nohup.out. Close or redirect fd0 - fd2 On Linux, nohup automatically closes stdin. 
If you're using MacOS or BSD this doesn't automatically happen, so you might want to redirect it yourself. This is because if a background process tries to read anything from stdin then it will pause itself whilst it waits for you to bring it to the foreground and type some input. This is probably a waste of time. If nohup detects that you have redirected stdout and stderr then it won't create nohup.out. As with all commands, if you put & at the end of the command, it will run in the background. You can bring it to the foreground by running fg, or see a list of jobs by running jobs. If you redirect input to /dev/null (… This will redirect stdout into a file: … > log.txt This will redirect stderr into a file: $ asdfadsa 2> error.txt If you run a command that generates lots of error messages along with “good” messages, you can redirect all the error messages (stderr) into /dev/null so that you can only see the useful stdout: $ grep -r hello /sys/ 2> /dev/null If you want to run a command and only see the errors (stderr), then you can filter out all the stdout by redirecting the stdout messages to /dev/null: $ ping google.com 1> /dev/null Redirect all output into /dev/null if you want a command to run quietly. Redirect all the output. The command below redirects stdout to /dev/null (the default file descriptor is 1 if it isn't specified) and then redirects file descriptor 2 into file descriptor 1: $ grep -r hello /sys/ > /dev/null 2>&1 Read input from a file instead of the terminal: 0<logfile Combining 2>&1 means send stderr wherever stdout is going. This means that you've combined stdout and stderr into one data stream and you can't separate them anymore. It also means you can pipe stderr the same as you can stdout. Input You can redirect stdin similarly. 
If you run… Ranges Searching Undo Splits Macros Other Verbs s - delete char under cursor and enter Insert Mode. r - replace char under cursor. c/hello - change until next occurrence of 'hello'. Registers \"ayy yank the entire row into register a. \"Ay yank to register A and append the new text to the existing contents of the register. :registers - preview the contents of… Insert Mode - delete back one word. - delete back to the start of the line or start of… cgn - if you are searching for a word (either by using / or * or #) and you want to change each instance of the search result, use gn to change or delete and then go to the next result. This will let you use the .dot operator to repeat both the steps (moving and changing). 0 - paste. if there are new-line chars… Normal Mode… or Select a column of numbers you want to increment, then g will turn them into an… Ranges :put =range(1,1… - insert a list of… :for i in range(1,10) | put | endfor - use a loop to generate a long list. Searching g# or g* for partial matches, like # or * for exact matches. Search for the word under the cursor, or similar: Press /… - this will copy and paste the word under the cursor into the search box. 
Edit it as necessary. After you've done your search, … to jump back to where your cursor was before. Find and replace whole words only: … Find and replace either old-word1 or old_word2: … g - show some stats about the current buffer - word count, line count, char count. Undo g- and g+ - undo branches. Undo changes within a period of time: :earlier 2d - undo changes in the last 2 days. :later 5m - redo all changes in the last 5 minutes. :earlier 3f - undo all changes in the last three buffer writes. s seconds, m minutes, h hours, d days, f saves. @a - Use the global command to execute macro a on all lines of the current buffer containing… - For every line containing “good” substitute all “bad” with “ugly”. Splits r - rotate the splits from left to right but only if they are split vertically. R - rotate the splits from right to left. H - move the current split to the far left and make it full height. J - move the current split to the bottom of the screen and use the full width. :only - close all splits except the current split. Macros @o - do the macro stored in register o on all lines that match the… Other … in Insert Mode will jump you into Command Mode for one command only and then put you back into Insert Mode. The .dot command only repeats commands that change the buffer content. It won't repeat…"},{"title":"Notes From \u201cPowerful\u00a0Python\u201d","category":"Technical/Developer Tools","url":"python-notes.html","date":"21 January 2021","body":"The parts of Aaron Maxwell's Powerful Python newsletter that I don't want to forget: Table of Contents Emergent Abstractions Practitioner, Engineer, Scientist Sentinel Values Levels of Python Code Read PEPs Emergent Abstractions Get used to expecting and letting abstractions emerge from projects. If you find yourself repeatedly solving similar problems in similar ways, what can you do that will simplify the code and the… Is it a couple of convenience methods on some helper class? 
The code below gives you three ways of instantiating the twitter API client within the same class: A generic “normal” way A specialized way that looks for certain environment variables A specialized way that looks for a configuration file import os import twitter # class ApiClient: def __init__(self, consumer_key, …): self.api = twitter.Api(consumer_key=consumer_key, …) @classmethod def …: return cls(…) @classmethod def …(cls, path): with open(path) as config_file: # ... return cls(...) # ... Practitioner, Engineer, Scientist Practitioner - You can use a thing (a framework, a tool) Engineer - You can use a thing and if you needed to, you could recreate it Scientist - You can create frameworks and paradigms that have never existed before Aim for the engineer level. Sentinel Values Instead of setting your sentinel value to something that is not quite impossible like None or \"None\", set it to object(). This is better because it creates a unique instance of the object class and there can be no ambiguity about where it came from. A sentinel value is a value you can set a variable to. It's special because it differs from all other legal or possible values that the variable could have. It's used as a signal or as a canary that something (bad or unexpected) has happened. Levels of Python Code Syntax - understand when indentation is important, when you need parentheses, colons, etc Idioms - the building blocks of a program. “Paragraphs” of code that follow common patterns, like for loops, __init__() methods (boilerplate) or context managers. Design Patterns - Less well defined than Idioms, but more useful. More info: Creational Patterns, like Factories Structural Patterns, like Adapters or Proxies Behavioural Patterns, like Visitor or Strategy These tend to be the same across different languages. Architecture - the largest structures in your software system. The language itself doesn't make a lot of difference; an application would have the same architecture whether it is written in Python or Java. 
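The ApiClient snippet above lost its method names and parameters to the index truncation; here is a plausible reconstruction of the classmethod-factory pattern it describes. The names from_env and from_config, the environment-variable name, and the JSON config format are guesses, and a stand-in class replaces twitter.Api so the sketch runs on its own:

```python
import json
import os

class FakeTwitterApi:
    """Stand-in for twitter.Api so this sketch is self-contained (assumption)."""
    def __init__(self, consumer_key):
        self.consumer_key = consumer_key

class ApiClient:
    # 1. the generic "normal" way
    def __init__(self, consumer_key):
        self.api = FakeTwitterApi(consumer_key=consumer_key)

    # 2. specialized: look for certain environment variables
    @classmethod
    def from_env(cls):
        return cls(os.environ["TWITTER_CONSUMER_KEY"])

    # 3. specialized: look for a configuration file
    @classmethod
    def from_config(cls, path):
        with open(path) as config_file:
            config = json.load(config_file)
        return cls(config["consumer_key"])
```

Each classmethod is an alternate constructor: it gathers the credentials from somewhere, then delegates to `cls(...)`, so all three ways of instantiating end up in the same `__init__`.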
The interface between different components would be different, but the \u201corgans\u201d of the body would essentially be the same. Read PEPs A Python Enhancement Proposal is a document that\u2019s written to propose a new feature of Python. It fully details the proposed feature, the arguments for and against it, and lots of sample code. If the PEP is accepted into a future version of Python, the PEP becomes the authoritative document for that feature and how to use it. PEPs tend to be written by the best programmers in the world, so hang out with them. Abstraction is a principle of OOP \u21a9"},{"title":"Mental Models I Used To\u00a0Use","category":"Non-technical/Learning","url":"mental-models-i-used-to-use.html","date":"20 January 2021","body":"The rules1 and mental models that helped me succeed in one season or phase of life may not be the best for the next phase. Here is a list of a few mental models I remember being conscious of in previous years. Probably I\u2019ve already forgotten some. Always ask \u201cwhy\u2026\u201d. Be obsessive about this. It\u2019s going to make things harder for a while before things get easier. You\u2019ll find difficult answers that you otherwise wouldn\u2019t. If you\u2019re only concerned with the present then it\u2019s true that ignorance is bliss, but otherwise it\u2019s a liability. \u201cWhat if\u2026\u201d is another good question to ask a lot. Adapt to the situation, don\u2019t make it adapt to you if you have any choice. Be kind of like water, going around things and through the gaps. Look for the edges and the gaps, the parts that aren\u2019t well known. Let people talk as much as they want to. Shut up and listen. If they mean you harm or don\u2019t respect you then it\u2019ll become more obvious the more they keep talking. If they mean you well or they\u2019re saying something useful, you will benefit more from letting them talk more.
Inversion - it can be hard to know if you should do something, but how would you feel if you didn\u2019t do it, or if it didn\u2019t happen? Regrets are inevitable; everyone has them. Same as making mistakes. Let your regrets be for things that you did do, and not what you didn\u2019t do. If you are willing to try something, fail at it, and still be glad that you tried, then you should almost definitely do it. Commit to it and enjoy the experience. Don\u2019t be scared, or at least, be scared and optimistic and happy2. There is beauty and luxury in being in such a bad spot that you are backed into a corner with seemingly no way out. Things become black and white, instead of shades of gray, and that will make priorities and options much clearer. You are likely to work very efficiently and effectively in this scenario, and you will learn important things about yourself. Now that I\u2019m older and I have more to lose, I can\u2019t ever let things become so bad that a situation becomes black and white. I have to navigate a world of grays. If they do become black and white, I\u2019ll already have a long list of failings. When I was younger, things were more fragile. My resources were smaller and things could quickly flip from good to bad. Enjoy the few benefits that a situation like that gives, because (hopefully) once it\u2019s gone it\u2019s gone for good. The best way to solve a problem is to prevent it from occurring in the first place. Succeeding at this will bring its own challenges. Take responsibility for things you are not responsible for, kind of. Do it deliberately and for your own benefit, but don\u2019t forget that you are only pretending that it\u2019s your responsibility. If you do this, you will force yourself to understand a situation more deeply and from other people\u2019s perspectives. This will let you learn faster and help you in future. Keep this at arm\u2019s length though - it\u2019s make-believe and you need to be able to switch it on and off. It\u2019s a toy for you to play with.
This seems to be what \u201cextreme ownership\u201d is. I think it\u2019s important to have mental models that you\u2019re comfortable with, because it lets you make decisions quickly and consistently. But understand that the map is not the territory, and these are just tools in a toolbox. or policies \u21a9 Courage isn\u2019t the absence of fear. It\u2019s being scared and doing the right thing anyway. \u21a9"},{"title":"What\u2019s So Different About\u00a0Now","category":"Non-technical/Social","url":"whats-so-different-about-now.html","date":"20 January 2021","body":"I think we are less aware of our ignorance than previous generations. It is easy to implicitly assume that all useful information is available to us, and that we are therefore more informed than we really\u00a0are. I think this is because the internet has made information more accessible, and global air travel has made the world feel\u00a0small. Whilst an individual would hopefully never pretend to know everything, I think it\u2019s easy to assume that the right information exists and is being used by the people to whom it is\u00a0relevant. But the accessibility of all information has put us in a situation similar to information scarcity. We still need to actively search for the information we want, because the information that comes to us easily or for free is not equal to what we find when we apply\u00a0effort. I can easily have so many short pieces of news or information that I am always slightly overwhelmed. The pace of modern communication encourages me to never slow down enough to form my own questions or frame my own arguments.
I can always find an answer to my questions, but when was the last time I checked that whoever gave it to me wasn\u2019t going to profit from\u00a0it?"},{"title":"Predicting the Future using Human Nature and\u00a0Technology","category":"Non-technical/Social","url":"what-happens-next.html","date":"20 January 2021","body":"Predicting the future sounds like a tough problem, but we try to do it all the time without realising\u00a0it. We predict the future when we think about how risky or scary something is, or when we think about what\u2019s really going to change because of an announcement or press release. We try to predict the future when we\u2019re at the supermarket checkouts and we try to pick the queue that will move the fastest. I always seem to pick the wrong\u00a0one. There must be a million ways of trying to predict the future, but all the good ones are models which reduce complexity and emphasise key factors. One of them could be comparing the influence of human nature and technology on the outcome, and then comparing the event to what\u2019s happened before. Human nature doesn\u2019t change, so if something is driven by fear or greed then it probably doesn\u2019t matter what century it occurs in. Technology is change, and if something is enabled or prevented due to technological progress then the date is\u00a0important. What is driving the scenario? Is it human nature or technology? Supermarket checkouts are mostly manual and require a couple of adults to work together, so human nature has a much bigger role in efficiency than tech. Young men will stack and pack quickly, old women will be the opposite.
What types of shopping bags they have, or how they pay, or even how many items they\u2019re buying, are probably not going to lead you to the fastest queue. The same probably works for getting through any queue."},{"title":"Financial Doom And\u00a0Gloom","category":"Non-technical/Social","url":"financial-doom-and-gloom.html","date":"19 January 2021","body":"Financial crises seem to happen fairly regularly, so they shouldn\u2019t be unexpected. But no-one seems particularly concerned about our current financial system; at the moment our attention is controlled by other threats. I\u2019m concerned that a lot of money has been injected into the money supply but we haven\u2019t seen any inflation. And I am concerned that the price of stocks is no longer related to the value created by the company but instead by macro economics. It\u2019s a terrible time to be a value investor. This should be an alarming statement. Value investing should always be a decent way to make money unless markets are broken. If the price of something doesn\u2019t represent its value then a correction is inevitable. Interest rates are really low at the moment, so if you have spare money and you want to make it work for you then where do you put it? Not into a bank account, because interest rates are low1, and not into government debt, because the yield is so low. It has to be stocks if you want the value of your investment to increase meaningfully. But everyone is doing this, which drives the price up, and because prices are increasing they increase even further. I think that the main reason for concern is super low interest rates and massive increases in the money supply, but there are a couple of other factors that are also contributing. It\u2019s easier than ever for retail investors to participate in the stock market, and this seems like a good idea.
However, if retail investors have enough influence to affect prices, and they themselves can be manipulated or influenced regarding what or when they buy or sell, then that is likely a new kind of threat to financial stability. We\u2019ve never seen social media combined with quick, cheap investment services for amateurs before. Index funds are also more popular than ever2 - the efficacy of index investing relative to traditional funds that use stock pickers is very high over medium or long time horizons, because index funds are much cheaper. But if index funds become too large then they end up influencing the market in predictable and rigid ways. Index funds cannot choose what they buy or how much they buy - they just track the index. If a company\u2019s stock crosses certain thresholds their stock has to be bought or sold. It seems like it\u2019s possible to create feedback loops where funds have to buy more of a rising stock, which increases its scarcity and price, which then requires index funds to purchase more of the same stock. The amount of euros in existence in 2019 was 90% more than in 2010.3 But inflation between 2010 and 2020 is 13%.4 Why is that? If the price of something doesn\u2019t represent its value, then a correction is inevitable. Why are interest rates low? Because confidence in the economy is low, so central bankers have to lower interest rates to make it 1. Cheaper for a business to borrow money to invest in their business, and therefore easier for a business investment to be profitable, and 2. More attractive for investors to use their capital to invest in a business (which grows the economy) relative to depositing spare cash in a bank account (which is safer but a less efficient way to deploy capital).
Interest rates affect the relative risk-reward ratios of different investment strategies. \u21a9 Index Funds Are the New Kings of Wall Street \u21a9 statista \u21a9"},{"title":"Debugging the more_categories plugin for\u00a0Pelican","category":"Technical/Developer Tools","url":"debugging-more-categories-pelican-plugin.html","date":"19 January 2021","body":"I\u2019ve realised that one of the plugins I use to make this blog is not working correctly. I use the plugin to: add subcategories, and assign multiple categories to articles. Subcategories aren\u2019t working, and Pelican thinks each article just has categories that contain forward slashes. In his \u201cPowerful Python\u201d emails, Aaron Maxwell recommends looking at the source code for popular python libraries to see how really good Python is written, and how talented developers write code and solve problems. This is a good opportunity to look at the code that powers the plugin and see if I can: Understand the source code Locate the source of the problem Fix the problem I don\u2019t know if Pelican is amazingly good quality or not, I get the feeling it could do with more developer resources, but I\u2019ve got a real reason and motivation to look at the underlying code so I\u2019m going to give it a shot. The documentation is sparse, which doesn\u2019t help; I get the impression that whoever wrote it feels like Pelican is simple and it\u2019s obvious what\u2019s going on1. It\u2019s not obvious to me. Pelican Plugins Every plugin has to have a register() function, here it is for the plugin: def register() I understand the idea of signals from Django, and generators are discussed a bit in the documentation. So what else is happening\u2026 As I write down my understanding of the plugin, I\u2019m aware that my understanding is definitely incomplete and probably wrong. I hope that as I progress I will see the mistakes in what I\u2019ve already written. get_categories is called first, and it takes two arguments, generator and metadata.
The entire function is 3 lines, so here it is (reconstructed):
def get_categories(generator, metadata):
    categories = text_type(metadata.get('category')).split(',')
    metadata['categories'] = [Category(name, generator.settings) for name in categories]
    metadata['category'] = metadata['categories'][0]
It looks like it gets the category from the metadata for each article. Presumably by the time this function is called the articles have already been parsed and a metadata object has already been created and populated with metadata about the articles, including categories. The first row splits up the categories if multiple categories are listed. metadata must be a dictionary, and there must be a metadata dict for each article, otherwise you couldn\u2019t just get the value associated with the dictionary key and then split the string on commas. This means that this function is called once for each article. I don\u2019t know what text_type does yet. Maybe it ensures that the output is always a string. It\u2019s imported from six, which I remember seeing being a dependency of some other packages. .. Having checked the documentation for six it looks like I was right - it represents unicode textual data in both python2 and python3. Pelican was originally written in Python2 I guess. Next step is to write a new key-value pair to the metadata dictionary for each article. This plugin adds functionality to Pelican by enabling categories and not just a category for each article. It seems clear that adding a categories key to the metadata dict is an obvious way to do this. The value for the categories key is a list where each item is an instance of the Category class. This class is instantiated using two arguments, name, which is the string from the previous row, and settings, which is currently not understood. .. printing the contents of settings shows that it\u2019s a dictionary of all the settings. Easily assumed and good to confirm. I\u2019ll dig into the Category class in a moment, but first let\u2019s quickly cover the last row of the function. The category attribute of the article\u2019s metadata is simply updated with the first item in the categories list (categories must be a list because it can be indexed.)
class Category() This class is the only class defined by the plugin (which is only 96 lines of code). It has 6 methods, 5 of them are decorated, and it has no constants. The decorators are property [3], _name.setter [1] and one other [1]. URLWrapper is imported from pelican.urlwrappers, and I don\u2019t know what that does beyond \u201cwrapping URLs\u201d. @property Decorators are functions that take methods or functions as inputs. Using property along with setter decorators lets a class have a property assigned to it whilst ensuring that arbitrary conditions or logic are upheld. If the @property decorator is over a method called foo, then there would need to be a decorator called foo.setter on a method somewhere in the class. That doesn\u2019t seem entirely right though, because in our Category class, we have a @property decorator over a _name method, and also a @_name.setter decorator over another method called _name. But the other methods with @property decorators (slug and ancestors) do not have any associated setter decorators or methods. The setter for _name seems to create parent categories if the string contains slashes (reconstructed):
@_name.setter
def _name(self, val):
    if '/' in val:
        parentname, val = val.rsplit('/', 1)
        self.parent = Category(parentname, self.settings)
    else:
        self.parent = None
    self.shortname = val.strip()
Here, self.parent becomes an instance of the Category class, which is instantiated using parentname and self.settings. This is recursive to however many levels of subcategories are specified. The ancestors and as_dict methods seem more confusing. ancestors isn\u2019t called or mentioned within the class definition but is called from the function which is called after the get_categories function returns. I don\u2019t understand why it needs an @property decorator though.
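A generic sketch of the @property/setter pattern discussed above; Temperature is a made-up example class, not part of the plugin:

```python
class Temperature:
    def __init__(self, celsius):
        self.celsius = celsius  # assignment goes through the setter below

    @property
    def celsius(self):
        # read access path for the underlying attribute
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        # the setter enforces invariants every time the attribute is assigned
        if value < -273.15:
            raise ValueError("below absolute zero")
        self._celsius = value

    @property
    def fahrenheit(self):
        # a property with no setter: derived and read-only, like slug or
        # ancestors in the Category class
        return self._celsius * 9 / 5 + 32
```

This also shows why slug and ancestors can carry @property without a matching setter: a setter is only needed for properties that are assigned to.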
The class inherits from URLWrapper, so that is probably the next best place to look\u2026 Indeed, looking at the definition of URLWrapper shows that the as_dict method is overriding the definition in the base class. I guess it\u2019s the \u201ccurse of knowledge\u201d \u21a9"},{"title":"Different Views For Different\u00a0Users","category":"Technical/Web","url":"different-views-for-different-users.html","date":"19 January 2021","body":"This blog serves a variety of purposes. It\u2019s partly a journal of how I\u2019m teaching myself to be a developer and a data scientist, and it\u2019s also a personal blog, with articles about my interests and experiences. It\u2019s unlikely that anyone is interested in every type of article, and I\u2019d like to make it easy for people to only read the content they\u2019re interested in. Therefore I thought I would separate the articles into two broad groups, technical and non-technical. If you visit this blog for the first time by clicking a link to a technical article, the site will then only show you the technical articles on the blog. It\u2019s the same for non-technical articles. If you want, you can change these settings by clicking the paw icon1 in the navbar on the blog index page. I did this mainly because I could. I like playing around with the blog. The JAM stack feels accessible and it\u2019s fun working with tailwind and with jQuery. I think that playing (being curious, lighthearted and unhurried, and not being concerned with failure) is really important. Especially for adults who don\u2019t usually do it much. Most of my successes or big opportunities have been the result of a process that started with playing around.
Here is the list of requirements I used when adding the feature: Requirements: If user lands on a page and DOESN\u2019T have local setting - create local setting based on type of article being read If user lands on a page and DOES have local setting which is contradicted - reset local setting to \u201call\u201d If user lands on index and DOES have local settings, only show articles that match the setting Steps: Index page: check if local storage option exists, print to console the result Index page: make button group Index page: make correct button active on page load by using localStorage Index page: update active button on page click Index page: filter articles when button clicked Index page: if local storage does exist, respect it Index page: add 3 stage switch to hamburger menu Index page: make hamburger menu behave intuitively on small screens Index page: if local storage does not exist, pop up a modal asking for a choice Article page: check if local storage option exists, print to console the result Article page: if local storage doesn\u2019t exist, create it according to article type Article page: if local storage does exist and is contradicted, update article type to all It\u2019s a paw because cats have paws and cat is like category. I might change this to something more intuitive in future, like making the icon an N if the user is only seeing non-technical posts, T for technical, and A for all posts. \u21a9"},{"title":"3 Different Types Of Programming\u00a0Problems","category":"Technical/Web","url":"different-types-of-problem.html","date":"18 January 2021","body":"Three categories of problem Last year when I was creating moneybar and pippip there were a few problems that took much more effort to solve than all the others. I think I could group problems into 3 buckets, based on how much time they take to solve. Type 1 takes less than 15 minutes to solve, type 2 takes between 15 and 45 minutes to solve, and type 3 takes more than 45 minutes (usually much more).
Type 3: When I start learning a hard thing (like web development) almost everything is in the third bucket and it\u2019s exhausting. You need to set aside big chunks of time, you need to be focussed and undistracted, calm and wide awake, and you need to be prepared for a long arduous journey. Probably your criteria for success should be \u201cam I dead?\u201d, because then if you\u2019re asking the question you\u2019re guaranteed to be successful, and keeping morale high is necessary for success. Type 2: Hopefully you can make good progress, understand the basics and internalize the relevant abstractions, and your problems quickly1 become type 2 problems. They each take from 15 to 45 minutes to solve. Maybe this is because you know enough to break some big general problem into smaller problems (you are developing domain expertise) and your intuitions for how to solve the problem are becoming better, so your first or second attempts are likely to be correct, rather than your fifth or sixth. Knowing how to google a problem so that you get the answer you need is also a really important skill, which requires intuiting how an English speaking expert would ask the question. This isn\u2019t trivial, but I don\u2019t hear people discussing this often. When most of my coding problems are type 2, it feels like I\u2019m learning most efficiently and when I\u2019m most productive. Type 1: After a while, the problems that need to be solved become type 1 problems. They take less than 15 minutes to solve, because: All the big problems have been solved and now you\u2019ve only got smaller problems left, and Your intuitions are good and your expertise has increased and you know where to look for answers.3 Exceptional problems: But there seems to be a consistent exception to this model.4 Let\u2019s be silly and call them type W problems. These are the problems that eat up far too many hours, and are tiring to solve, even when you are (in most other respects) an expert.
For me, these tend to relate to blob storage solutions for web apps deployed into production. I can think of several factors why this is so, and I\u2019ll describe the specifics before generalising. When a web app runs in production, the data is not stored on the web server, because the things that make a web server cheap and efficient are not the things that make a database or a file storage bucket cheap and efficient. Therefore they are stored somewhere else and you need some plumbing to join everything together. There are some abstractions involved to make this work easily and securely. However, when developing locally, you are doing everything on your laptop. You have a web server, relational database and file system all in the same place. This is a big, fundamental architectural difference between your development environment and your production environment. As a general rule, these are supposed to be as similar as possible. These differences make it much easier to make something that works locally but doesn\u2019t work in production, and it\u2019s very hard to test if a thing will work in production without deploying it to your staging environment, which you are likely less familiar with than your local development setup. Deploying to staging and debugging on staging is slower and harder than doing the same thing locally. Logging (and filtering) will likely be more important. Solving exceptional problems So how do you solve these problems quickly and efficiently? What is it about this problem that makes it so hard? Let\u2019s examine what makes the problem difficult to solve: Iteration cycles are slow - I can\u2019t test locally, I have to deploy to staging and this takes time. The problem occurs in a \u2018high friction\u2019 environment - it\u2019s difficult to dig around and figure out what\u2019s really going on when it\u2019s hidden below 3 different layers of abstraction on a remote machine that I have limited access to via a web browser.
I want to be able to dig and investigate quickly and easily using the same tools I use for writing and testing code locally. I\u2019ve taken great efforts to set up my local development environment so that I can do this, and it\u2019s stressful to switch to a different and more limited set of tools. The problem is the result of several things interacting at once, and I can\u2019t just test things one at a time. These things are probably very similar to the abstractions mentioned above. Thinking clearly, learning, building, solving problems, all rely on being able to separate or untangle a seemingly complex situation into its component parts so that you can figure out what causes what. If you can\u2019t isolate individual concerns or components, you have a black box that is keeping you ignorant. In web development, customized logging is usually a good way to begin isolating and exploring particular components. Having said all that, I think the best way to solve a problem is to prevent it from occurring in the first place, but I\u2019m not good enough to figure out how to do that, yet. on which timescale? Life is long, does it really matter if it takes 1 week or 1 month to learn something meaningful? Momentum, and having fun, is important though. \u21a9 from a personal growth point of view. I suppose from an employer\u2019s point of view they want all problems solved fast, type 1 problems. \u21a9 Open the right file, google the right query (and follow the link to stack overflow), make some changes, run your static type checker and linter, run your tests, and push. Done and on to the next item. \u21a9 which is totally fine. It\u2019s just a mental model, and the map is not the territory \u21a9"},{"title":"Why I Want To Write\u00a0Regularly","category":"Non-technical/Learning","url":"why-i-want-to-write-regularly.html","date":"18 January 2021","body":"I\u2019ve started writing more frequently. I want to do this because I often have thoughts which I\u2019d like to explore and develop further, but rarely do.
Writing forces me to organise my thoughts and look at how substantial they really are, or aren\u2019t. There is truth in the saying that \u201cto know a thing you need to be able to teach it\u201d1, and writing well has several similarities to teaching. Can I really copy a collection of thoughts from my head to yours? Powerful ideas are resilient and have many consequences. The older I get the more I believe that ideas matter2. They have so many subtle consequences. They are the first dominos. I don\u2019t expect writing regularly to become a permanent habit - it doesn\u2019t need to be. But I do want to focus on it for a while so that I become significantly better. It\u2019s a skill that has too many benefits to be ignored. The blogs I remember most are focussed and unapologetic about their priorities. Most of them have a lot of text and do not focus on design. They make it easy to read content and don\u2019t spend time or attention on header images or styling. Before I redesigned this blog I had default settings that asked me to supply an image for each post, and for a summary, and a suggested tweet. None of it was necessary, and whilst they all tried to make the blog better they ended up making it harder to write. These peripheral features added complexity and distracted from the main thing. They are still there if I want to use them, but they are not set up to be used by default anymore. They\u2019ve been moved to the background and if I forget they exist then that\u2019s OK - it just shows they weren\u2019t as important in practice as I thought they would be. I was probably just having fun adding new features and working out how to build them. You Ain\u2019t Gonna Need It, mate. Wikipedia article, and some external validation \u21a9 A complementary notion is that asking the right question is more important than finding the right answer.
I guess asking the right question is always necessary, but finding the right answer is only sometimes sufficient. Sometimes you can get the answer a bit wrong if you asked the right question, and still get enough benefits to avoid the problem. \u21a9"},{"title":"Python: Becoming A Better Python\u00a0Developer","category":"Technical/Developer Tools","url":"becoming-a-better-python-developer.html","date":"18 January 2021","body":"I\u2019ve been subscribed to Aaron Maxwell\u2019s \u201cPowerful Python\u201d newsletter for over a year and I really like it. His emails are opinionated and candid, and singularly focussed. He seems passionate about what he does and I like\u00a0that. Ultimately the emails are designed to drive sign-ups for his courses, which I suspect would be very good, but there is a lot of value in the free emails. Thanks Aaron. I realised that the emails are sequential and each subscriber gets the same sequence of messages regardless of when they signed up. There is the \u2018first\u2019 message, and then the \u2018second\u2019, and they kind of progress and\u00a0flow. This means that there are more benefits to paying attention than for usual email subscriptions. Even though the emails arrive when I\u2019m at a supermarket or making dinner for my kids, it\u2019s good to try and read it. After being subscribed for several months, I unsubscribed and resubscribed. Now that I know how reliable and high quality this advice is, I\u2019m going to prioritise working through the examples and doing some of what I missed the first time. I\u2019ve gone back to the beginning to reinforce the parts I know and to try again with what eluded me the first\u00a0time. Three kinds of practice projects to become a better Python developer: A web app - use Django if you don\u2019t know which framework to use. Done\u00a0this. A command line tool - use the argparse module, because it\u2019s in the standard library. Haven\u2019t done this yet, I guess now is a good time to start.
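A minimal argparse skeleton for that kind of command line tool; the tool name wordcount and its flags are invented for illustration:

```python
import argparse

def build_parser():
    # argparse is in the standard library, so the tool has no dependencies
    parser = argparse.ArgumentParser(
        prog="wordcount",
        description="Count words in a text file.")
    parser.add_argument("path", help="file to read")
    parser.add_argument("-t", "--top", type=int, default=10,
                        help="show the N most common words (default: 10)")
    return parser

def main(argv=None):
    # argv=None makes argparse fall back to sys.argv, while tests can
    # pass an explicit list of arguments
    args = build_parser().parse_args(argv)
    with open(args.path) as f:
        words = f.read().split()
    print(len(words), "words in", args.path)

if __name__ == "__main__":
    main()
```

Keeping the parser in its own function makes the tool easy to test: call build_parser().parse_args([...]) with a list instead of running the whole program.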
It seems like the simplest and quickest of the three kinds of project, and I can see how useful it could be - it lets you use the app in many different contexts, outside the python eco-system and anywhere command line tools can be run. A machine learning model - I\u2019ve already studied this, from theory (numpy) to frameworks (tensorflow). I\u2019m happy to see it\u2019s\u00a0included."},{"title":"Using Vim with large\u00a0codebases","category":"Technical/Developer Tools","url":"vim-for-large-projects.html","date":"15 January 2021","body":"I use Vim as my text editor and IDE. I like that it\u2019s free, open source and customizable. Below are some of the most useful plugins and features I\u2019ve started using this year when I was building Moneybar and learning how to use\u00a0Django. There\u2019s a copy of my .vimrc at the\u00a0end. I\u2019m happy to invest time and effort learning how to make the most of Vim and its plugins. I\u2019m confident that I\u2019ll still be using it twenty years from\u00a0now. Filetype plugins - if you want some settings to be active only for particular filetypes, like .py (python) or .txt (text), then create a file in ~/.vim/ftplugin. Vim will look in this file when it opens a buffer of the corresponding file type. Good for formatting options like line length, tab spaces, and vim commands that are filetype-specific. You can\u2019t activate plugins in these files though. All the plugins have to be activated in your .vimrc in the usual\u00a0way. vim-test - this plugin lets you run tests without leaving vim. You can run the test that\u2019s nearest the cursor, or all the tests in the current buffer. It\u2019s very customizable. I wish it could be a bit faster, but I could probably improve that myself by changing some\u00a0settings. A.L.E. - The incredible Asynchronous Lint Engine (A.L.E.) applies fixers and linters to various filetypes, when you want and how you want. Super useful for writing tidy code and catching mistakes before the code is\u00a0run.
junegunn/fzf and fzf.vim - It took a little getting used to at first, but now I can\u2019t imagine not using a tool like this (this could be said about so many vim-related things). Use fzf to switch between open buffers, open a new file, search for files using the filename, or search within all the files in the project for specific\u00a0text. tagbar - This plugin opens a sidebar which contains a list of functions and classes and methods (tags). You can use it to see which methods a class contains, and jump to the part of the buffer where a tag is\u00a0defined. This is my .vimrc during January\u00a02021:
\" ========== Global ==========
set nocompatible \" always put it at the top of .vimrc. Affects mappings, undo, etc.
set encoding=utf-8 \" utf-8 encoding
set termguicolors
set t_Co=256 \" number of colors
set noerrorbells vb t_vb= \" no error bells, yes screenflash
set linespace=
set scrolloff= \" minimum number of screen lines above and below the cursor
set shortmess- \" show how many times a search result occurs in current buffer, and index of current match
set hidden
set number relativenumber \" Line numbers
set splitbelow
set splitright
\"
set tabstop=8 softtabstop expandtab shiftwidth smarttab
set undofile \" Maintain undo history between sessions
set \" put all the undo files in this dir
filetype on \" enables filetype detection
filetype plugin indent on \" detection on, plugin on, indent on.
To see the current status, type: :filetype syntax on \" syntax highlighti - try 'syntax on/enable' set noesckeys \" might break stuff, should make delay smaller set timeoutlen \" timeoutlen is used for mapping delays set ttimeoutle \" ttimeoutle is used for key code delays set incsearch ignorecase smartcase hlsearch highlight Search guibg=purp guifg='NON highlight Search cterm=none ctermbg=gr ctermfg=bl highlight CursorColu guibg=blue guifg=red highlight CursorColu ctermbg=re ctermfg=bl nnoremap // nnoremap # #`` nnoremap * *`` \" close buffers properly go to previous buffer, then delete the buffer you were just in. nnoremap bd :bp\\|bd # inoremap bd :bp\\|bd # \" Spell check set spelllang= nnoremap ss :setlocal spell! nnoremap sf z=1f :call Flash() ve :e $MYVIMRC vr :so $MYVIMRC \"+y if set \" copy to the system clipboard if \" X11 support set endif endif \" Go into NORMAL mode inoremap jk \" view working directory nnoremap pw :cd %:p:h \" toggle line wrap nnoremap lw :set nowrap!ln :set \" Insert current datetime nnoremap dt A ()hh \" map w to ` nnoremap ` w \" Swap : and ; nnoremap ; : nnoremap : ; vnoremap ; : vnoremap : ; \" Navigation & movemement \" save buffer if it has been changed nnoremap ww :update \" save all changes nnoremap wa :wa \" close buffer nnoremap qq :bp\\|bd # nnoremap wq # \" switch buffers nnoremap + :bn nnoremap _ :bp \" Split navigation nmap h nmap j nmap k nmap l nmap ww nmap wq \" split (pane) resize nnoremap :resize +2 nnoremap :resize -2 nnoremap :vertical resize +2 nnoremap :vertical resize -2 \" open help in vertical split by default cabbrev vhelp vert help \" Natural cursor movement over wrapped lines nnoremap j gj nnoremap k gk \" Insert blank lines in normal mode nnoremap o ok nnoremap O Oj \"========= PLUGINS ========== call \" numbers as text objects Plug \"run shell commands async in vim8\" Plug let = 10 \" When using :python or :!python, access the packages in venv \" \" Plug \" force quickfix to be full widtth au FileType 
qf wincmd J \" testing - many languages and test runners Plug let test#strat = let = 'pytest' let = '-x' let = \"belowrigh nnoremap tn nnoremap tf :TestFile< nnoremap ts :TestSuite nnoremap tl :TestLast< nnoremap tg :TestVisit \" toggle the quickfix window function! if copen 15 setlocal else cclose endif endfunctio nnoremap cc :call \" generates an index (or tag) file of language objects found in source files \" jump to definition \" jump back \" g] see a list of multiple matches \" Plug \" (re)genera tags file in the bg Plug let = ['.json', \" sidebar that displays the tags of the current file, ordered by their scope Plug nnoremap nnoremap \" add python library code to tags file, goto def with let pyEnvLib = $VIRTUAL_E let pyEnvLib .= \" Async linting engine Plug let = 0 let = 0 \" ALE completion let = 0 set let = 1 nnoremap at :ALEToggle nnoremap af :ALEFix aj :ALENext ak \" iSort Plug \" track the snippets engine Plug \" Snippets are separated from the engine. Add this if you want them: Plug \" Trigger configurat Do not use if you use let let let \" If you want :UltiSnips to split your window. \" let Plug Plug Plug nnoremap x :YcmComple GoTo \" the subcommand add entries to Vim's 'jumplist' so you can use \" 'CTRL-O' to jump back to where you were before invoking the command (and \" 'CTRL-I' to jump forward; see ':h jumplist' for details) let = 0 let let = 1 let = 1 let = 1 let = 1 let = 1 let = 1 \" autoclose parens, brackets etc \" Plug \" vim-tmux focus events Plug \" Code folding \" Plug \" match m of n \" Plug \" adds vertical lines to easily show indent levels Plug \" Fugitive Plug \" Marks Plug \" Latex Vimtex Plug let g:tex_flav = 'latex' autocmd Filetype tex set updatetime let = 'open -a Preview' let = \\'specifie changed to'.\"\\n\". 
\\'You have \\'Missing number, treated as zero.'.\"\\n \\'There were undefined \\'Citation %.%# \\'Double space found.'.\"\\ let = 8 \" Rainbow parenthesi let blacklist = ['html', 'md', 'wiki'] autocmd BufWritePr * if &ft) < 0 | Plug let = 1 let g:rainbow_ = { \\'guifgs': ['green', 'magenta1' 'gold', 'red', \\'guis': \\} \" Set color scheme. set Plug \" colorschem colorschem badwolf let = 1 let = 1 let = 1 \" colorschem modificati highlight Comment ctermfg=cy guifg=cyan highlight pythonComm ctermfg=cy guifg=cyan highlight LineNr ctermfg=cy guifg=cyan hi nontext term=bold ctermfg=Cy guifg=#80a gui=bold hi vimLineCom term=bold ctermfg=Cy guifg=#80a gui=bold \" SpecialKey - use :set list to toggle visibility of EOL, CR, etc hi specialKey term=bold ctermfg=Cy guifg=#80a gui=bold \" colors for flashing cursorline and cursorcolu hi CursorLine cterm=NONE ctermbg=gr ctermfg=bl guibg=gree guifg=blac hi CursorColu cterm=NONE ctermbg=gr ctermfg=bl guibg=gree guifg=blac \" query what kind of syntax is this color? - wc nnoremap wc :echo \"hi<\" . . '> trans<' . .\"> lo<\" . . \">\" \" fuzzy file, buffer, tag finder set \" ensure you have the latest version Plug { 'do': { -> fzf#instal } } Plug nnoremap e :Files nnoremap r :Buffers t :Tags nnoremap ff :Rg \" nnoremap ff :Ag nnoremap la :BLines ll :Lines nnoremap ' :Marks nnoremap fh :Helptags< nnoremap fs :Snippets< nnoremap fc :Commits fb :BCommits< nnoremap hh :History h: :History:< nnoremap h/ :History/< \" let = --info=inl \" let --files --hidden\" let = 0 let g:fzf_layo = { 'down': '~50%' } \" let = '' let = 'right:0%' function! 
let joined_lin = join(a:lin \"\\n\") if len(a:line > 1 let joined_lin .= \"\\n\" endif let @+ = joined_lin endfunctio let g:fzf_acti = { \\ 'ctrl-t': 'tab split', \\ 'ctrl-x': 'split', \\ 'ctrl-v': 'vsplit', \\ 'ctrl-o': \\ } let g:fzf_colo = \\ { 'fg': ['fg', 'Normal'], \\ 'bg': ['bg', 'Normal'], \\ 'hl': ['fg', 'Comment'] \\ 'fg+': ['fg', 'CursorLin 'Normal'], \\ 'bg+': ['bg', 'CursorLin \\ 'hl+': ['fg', 'Statement \\ 'info': ['fg', 'PreProc'] \\ 'prompt': ['fg', \\ 'pointer': ['fg', 'Exception \\ 'marker': ['fg', 'Keyword'] \\ 'spinner': ['fg', 'Label'], \\ 'header': ['fg', 'Comment'] } \" grep in vim - shows results in a split window Plug \" session tracking Plug \" pairs of handy bracket mapping Plug \" Plug \" repeat commands from plugin mappings Plug \" vinegar Plug let = 3 \" CSV Plug \" nerdtree Plug nnoremap n let let = 1 \" Automatica delete the buffer of the file you just deleted let \" 2 - open nerdtree only if directory was given as startup argument let \" always focus file window after startup let \" Status bars Plug Plug let = 1 let = 0 let = 0 let let \" remove encoding status let = 1 let let = 1 let = 1 let = 1 let = 1 let = 1 let = 0 let = 0 let = 0 let = 0 let = \" comments Plug let = 1 let = 1 let = 'left' let = 0 let = 1 \" markdown. tabular is required Plug Plug let = ['python=p let = 0 let = 0 let = 1 let = 0 let g:tex_conc = \"\" let = 1 let = 4 let = 1 \" writing prose Plug Plug augroup pencil autocmd! autocmd FileType wiki,md,tx call pencil#ini autocmd FileType wiki,md,tx :PencilSof augroup END let = 'soft' autocmd! User GoyoEnter autocmd! User GoyoLeave \" Ensure :q to quit even when Goyo is active function! s:goyo_ent let b:quitting = 0 let = 0 autocmd QuitPre let b:quitting = 1 cabbrev q! let = 1 q! setlocal wrap endfunctio \" Quit Vim if this is the only remaining buffer function! s:goyo_lea if b:quitting && bufnr('$') == 1 if qa! else qa endif endif endfunctio autocmd! User GoyoEnter call autocmd! 
User GoyoLeave call nnoremap g :Goyo \" python linting \" F7 checks flake8 Plug Plug \"Flagging Unnecessary Whitespace highlight BadWhitespace ctermbg=red guibg=dark Plug let = ['latex', 'html'] let = 1 let = [] \" JavaScript Plug let = 1 let = 1 \" format .JSON files by using the jq cli tool com! JQ %!jq \" HTML/JINJA Plug Plug \" Plug let = \"*.html, *.xhtml, *.phtml\" call plug#end()"},{"title":"Using RSS","category":"Non-technical/Learning","url":"using-rss.html","date":"14 January 2021","body":"Updated: 10 Feb 2021 I found a blog post which is surprisingly similar to my thoughts on RSS feeds, but better presented and thought through. The post mentions the idea that \u201cRSS is about capturing the long tail of blogs that don\u2019t post frequently\u201d 1. This idea crystallised why I was so glad I\u2019d started using RSS feeds again. If readers use RSS, then authors don\u2019t need to concern themselves with attention. This removes pressure on the author to post frequently and lets them focus on quality over quantity. News feeds and ad supported platforms have fundamentally different mechanics and incentives. With RSS I can let good quality content come to me, on its own schedule. I don\u2019t need to remember to look for it, and the authors don\u2019t need to remind me that they exist. Google Reader RSS is a very effective way of having good quality information come to you. Back in 2008, I used to use Google Reader to subscribe to RSS feeds. I was an aspiring photographer back then and I remember being subscribed to around 80 blogs. Each day I\u2019d read articles from whoever had posted something new, without needing to visit their websites or remember who they are or that I\u2019d subscribed to their blog. The authors didn\u2019t need to optimize their output according to an opaque and changing algorithm either - they didn\u2019t need to optimize article length, tags, post frequency, image inclusion or linked content.
They could write how they wanted to, which I suspect leads to higher quality content. Social Media A few years later Google Reader was closed down, presumably because using RSS didn\u2019t fit with Google\u2019s advertising model. I was unaware of it at the time but I imagine it sent shockwaves through blogging communities and probably upended many businesses. I mostly stopped reading blogs. Facebook was growing fast, Instagram felt new and exciting, and content was moving onto \u2018platforms\u2019 or into walled gardens. And as they kept on growing, the average quality of the content decreased. Twitter is like this now I think. There are some real diamonds to be found from time to time, but there\u2019s a lot of mud too. Mostly it\u2019s just mud, but the occasional diamond can have outsized benefits. RSS isn\u2019t like this. I choose the contents of my \u2018news feed\u2019, and each article can be much longer than a Tweet, or a caption to a photo, or a status update. It\u2019s hard to write well and to create an interesting or useful blog post, and that makes it harder to dilute quality with entertaining distractions. I have complete control over what content I see, and I can change it whenever I want. The process is designed around me. Reeder5 I used NetNewsWire for a few weeks, but it couldn\u2019t sync between my laptop and phone, so I bought Reeder 5. It\u2019s got a few unusual design patterns, but it works well and has all the features I want. I\u2019ve been unsubscribing from email newsletters and subscribing to the RSS feed instead. It keeps my inbox quieter, and it feels good to have a \u2018separation of concerns\u2019. It makes it easier to read interesting content without being distracted. \u21a9"},{"title":"Notes on learning\u00a0Django","category":"Technical/Web","url":"learning-to-django.html","date":"14 January 2021","body":"Table of Contents In the\u00a0beginning A personal The best\u00a0moments In the\u00a0beginning I came to web development via business analytics.
I was working as an accountant and Excel wasn\u2019t good enough anymore, so I looked around for a way to get started and came across Jupyter Notebooks. Notebooks are said to be a kind of \u201cgateway drug\u201d to programming and I think that\u2019s true. They\u2019re the easiest and fastest way to start programming that I\u2019ve come\u00a0across. When you\u2019re working in a notebook, it\u2019s easy to get data, wrangle it, and show some results. But as soon as you can create a chart or some summary table you inevitably wonder how you can show this to people more easily, and publishing the results to a website feels like the best and most general solution. Unfortunately it\u2019s also the hardest, and so begins a long series of compromises and incremental progress. Learn to use a dashboarding API, and learn to create static sites. But the end-goal, the ultimate solution, is a data driven web app, with saved user preferences, scalable performance, and automatically updated data\u00a0sources. A personal When I moved to the Netherlands I wanted to use a personal finances dashboard to check weekly expenses. There wasn\u2019t a web-app that would do this (though there are a couple of apps that are trying) so I built my own dashboard. Then a few friends asked if they could use it too. They couldn\u2019t, because it was just a dashboard and not a web app, but I thought this was a good reason to jump into Django. It was a much bigger task than I anticipated. (And that\u2019s OK.) It took several attempts and was super frustrating. I would dabble for a few weeks, do a few tutorials, and then get completely lost when I tried to do something by myself. I\u2019d get disorientated working across many different files and trying to visualise which part of the model, or of the cycle, I was currently working\u00a0on. I came to realise that the mental load seems so large at the beginning because Django is really a whole stack of technologies and abstractions combined (or stacked) together.
Many of these have to be used together at the same time before you can see any evidence of success at all. I think the hardest things about Django are not actually Django. You\u2019ll need to be comfortable with classes and inheritance. You\u2019ll also need to be comfortable with working across multiple files, and have some tools for searching across all your open buffers, or all the files in the project, at the same time. You\u2019ll also need to be comfortable with version control (Git) and using the command line. Get familiar with stack traces\u00a0too. If you\u2019re familiar enough with all these things, so that using them doesn\u2019t feel new, but ideally feels familiar and comfortable, then I think you\u2019ll make quite quick progress with\u00a0Django. Django uses the Model-View-Template (MVT) pattern. Models are how Django maps Python objects to items in your database (oh yeah, you need to be familiar with SQL too\u2026), Views are where requests are processed (also Middleware) and turned into Responses, which are then combined with templates (unless you\u2019re building an API). You might notice I haven\u2019t mentioned what a Controller is - get used to information feeling incomplete whilst you\u2019re learning the ropes. It\u2019ll become clear soon\u00a0enough. The best\u00a0moments The \u2018curse of knowledge\u2019 states that once you\u2019ve learnt something you can\u2019t imagine or remember what it\u2019s like to not know it. Before that happens completely I want to record some of the \u2018ahah!\u2019 moments of learning Django. For context, I stopped working as a freelance data scientist in April and after a few weeks wondering if Django and PostgreSQL and Python was the way to go (yes it is; use boring technology) I began working full-time on what would become MoneyBar.n I called it \u2018myeuros\u2019 in the\u00a0beginning. The learning curve felt steep.
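The Model-View-Template split described above can be sketched without the framework itself. This is a framework-free toy, not real Django code - the names (transactions, transaction_list, TEMPLATE) are all invented for illustration:

```python
# A framework-free sketch of Django's Model-View-Template flow.
# All names here are invented; real Django uses classes, URLconfs
# and template files instead of these stand-ins.

TEMPLATE = "<ul>{items}</ul>"  # stands in for an HTML template file

# "Model": in Django this would be a class mapped to a database table
transactions = [
    {"id": 1, "amount": 12.50},
    {"id": 2, "amount": 3.20},
]

def transaction_list(request_path):
    """'View': receives a request, queries the model, returns a response."""
    items = "".join(f"<li>{t['amount']}</li>" for t in transactions)
    # "Template": the context is rendered into markup and returned
    return TEMPLATE.format(items=items)

html = transaction_list("/transactions/")
```

The point of the sketch is the division of labour: the model owns the data, the view owns the request handling, and the template owns the presentation - which is why there is no separate "Controller" to find.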
I wanted to do things \u201cright\u201d the first time because I wasn\u2019t building a toy, and although I felt that hindsight would show this to be a mistake in terms of efficiency, I did it anyway because I have a hunch that following my compulsions sometimes makes life harder in the short term and better in the long\u00a0term. The best moments are usually preceded by the hardest ones. Adding a unique identifier to an existing model. I used pydanny\u2019s template. Honestly, by the time I\u2019d gone through the quickstart process and googled the nouns in all the questions (what is Sentry, what is Celery and what is a task queue, what is whitenoise etc.) I was already tired. Play with it a few times and come back to\u00a0it. Anyway, I wanted to start with authentication because the project template has that part kind of up and running for you out of the box. It uses the Django Allauth package, which is awesome, and reliable, and fully featured\u2026 and extremely abstracted. Good luck looking at the module code and understanding it if you\u2019re not an\u00a0expert. I wanted to give each user a unique ID - a UUID - when they signed up. This would be used in query strings instead of usernames or incremental keys. This was so hard the first time! And it turns out it\u2019s not a trivial task, not if you already have a few users in your (test) database. Sure you can reset the database and start again, but experimenting like this is fairly complex. Understanding how the Python model classes (the ORM) map to the relations in the PostgreSQL database was complex, and if I got confused, should I try to fix it by changing Python models, editing migrations, or working on the database directly? Getting started is one of the hardest parts. After I\u2019d figured this out I started creating models for other, simpler data (transactions and bank accounts I expect). This was much simpler and faster.
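The UUID idea described above - exposing a random identifier in URLs instead of the incremental primary key - can be sketched in plain Python with the standard library. This is a hypothetical stand-in, not the Django version; in a real model it would be roughly a `models.UUIDField(default=uuid.uuid4, editable=False, unique=True)` plus a migration:

```python
import uuid

# Hypothetical sketch: each user gets a random UUID alongside the
# incremental primary key, and only the UUID appears in URLs.

class User:
    _next_pk = 1  # mimics the database's auto-incrementing key

    def __init__(self, username):
        self.pk = User._next_pk      # incremental key: guessable, enumerable
        User._next_pk += 1
        self.uuid = uuid.uuid4()     # random key: safe to expose publicly
        self.username = username

    def profile_url(self):
        # e.g. /users/9f1c2e7a-.../ rather than /users/1/
        return f"/users/{self.uuid}/"

alice = User("alice")
```

The awkward part the post describes is not this code but the retrofit: existing rows have no UUID yet, so the migration has to add the column as nullable, backfill a fresh `uuid4()` per row, and only then add the unique constraint.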
I remember driving home one evening thinking that if I could get this far, then success was within reach. Testing\u00a0code Before long, testing each part of the app by hand when I added or changed a feature was no longer trivial. I needed to find some way of automatically creating users and checking that they could log in and access\u00a0views. I began working with pytest, and really found it hard to wrap my head around the idea of accessing different parts of the app not by requests and responses but by accessing classes directly. I think it\u2019s normal and good to code at the limit of your knowledge, where you know just enough to make a thing \u201cwork\u201d. But this approach falters when you want to then test what you wrote. Or at least, the measure of \u201cjust enough\u201d really changes when you require tests to be written. You don\u2019t just need to make it work, you need to understand why it works, so that you can write tests to assert that certain conditions pass and others\u00a0fail. This feels really satisfying when it works, because you have proof that you really have grasped a bigger picture. There are far fewer (relevant) black boxes when you write tests. But it also makes learning slower, at least in the short term. It means you might have to get comfortable with a handful of abstractions when you\u2019ve already solved the problem you started with. This is frustrating and it takes discipline to slow down, take a deeper look at the solution, and not just race on to the next\u00a0feature."},{"title":"Data Science vs Web Development: Larger Code\u00a0Bases","category":"Technical/Developer Tools","url":"larger-code-bases.html","date":"14 January 2021","body":"Code\u00a0Structure One of the most immediate and basic differences between working as a data scientist or as a web developer is the number of files the codebase is spread across and the amount of code within each\u00a0file.
Web applications tend to be very modular - there are a lot of different things going on in a modern web app and generally they all need to be able to be modified or updated independently of each other. This requirement encourages modular code base architectures, with the code broken down into many small files. When working on a data science project you often have a well defined and quite narrow pipeline. Each stage of a pipeline has well defined inputs and\u00a0outputs. This seems to have the consequence of making data science projects tend towards a handful of files, each with a substantial amount of unique (not boilerplate) code. In web development there seems to be more boilerplate, many more files spread across a tree of directories, and the average number of lines of code per file is lower. IDE\u00a0features These differences mean that code organization tools and IDE features play very different roles within each industry. In web development you really need to be able to jump between different files (or buffers) quickly, and search for text across multiple files. Writing idiomatically becomes more important, and writing code within discrete testable units becomes essential so that things don\u2019t break without being\u00a0noticed. In data science, linting feels more optional, and searching for text within methods or functions outside the current module is\u00a0rarer. I didn\u2019t appreciate this until I paused my work as a Data Scientist and began building non-trivial web\u00a0apps."},{"title":"Test Driven\u00a0Development","category":"Technical/Developer Tools","url":"test-driven-development.html","date":"6 January 2021","tags":"python, django, testing, web-app ","body":"Test Driven Development was mind-bending when I first grappled with it: \u201cWrite a test for the code before you write the code\u201d. \u201cAssert that your code matches your expectations by understanding all the inputs and all the outputs for every function or method I write\u201d.
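The \u201cwrite the test before the code\u201d idea can be sketched with plain assert statements in the pytest style. The function name `monthly_total` is made up for the example:

```python
# The TDD loop, sketched: 1) write the test first and watch it fail
# ("red"), 2) write just enough code to pass ("green"), 3) refactor.
# `monthly_total` is an invented example function, not from the post.

def test_monthly_total():
    # Written *before* monthly_total existed - at that point running
    # this raised a NameError, which is the "red" step.
    assert monthly_total([10.0, 2.5, 7.5]) == 20.0
    assert monthly_total([]) == 0.0

# The "green" step: the simplest implementation that passes the test.
def monthly_total(amounts):
    return sum(amounts)

test_monthly_total()  # with pytest you would just run `pytest` instead
```

Pytest discovers `test_*` functions automatically and rewrites bare `assert` statements to give rich failure messages, which is why no special assertion methods are needed.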
Last summer I was building a web app and began to break things when adding new features. This soon led to lots of clicking around different pages to test if stuff was still working each time I made an update. This led to me thinking there must be a better way, which eventually brought me to Test Driven Development (TDD). It should have just led me to writing tests, which it did. But googling whatever I googled got me down the TDD rabbit hole rather than just the \u201cwrite some tests\u201d rabbit hole. Write tests for your code before you write the code. Write tests for bugs you\u2019ve fixed to check they stay fixed. Write tests as a kind of documentation to show what stuff is supposed to be doing. Errr\u2026 Django was a big enough pile of abstractions as it was. Views, ORMs, mixins, serializers. Trying to add factories and fixtures into that took some getting used to. But eventually I made some progress, and now I quite enjoy running coverage reports to keep coverage close to 100%1. Some of the main things I\u2019ve learnt about writing tests: Use PyTest as much as possible rather than other testing libraries - its assert statements are more intuitive than Django\u2019s own testing framework, and you can use it in any Python codebase, not just Django. It has lots of extensions and seems good at getting the job done fairly easily. Write tests as you go. I haven\u2019t (yet) reached the elevated level of writing tests before I write the code to be tested, though I see why that would sometimes be useful. I do think writing tests sooner rather than later is best though, ideally as soon as you\u2019ve got a basic version of your feature working. Use Coverage to show you which code is covered by your tests, and which branches or edge cases are not. But be warned, it doesn\u2019t tell you if the test is useful or not, only that it passes and which methods or functions it uses. Fixtures are great for keeping tests fairly DRY.
Freezegun is great for testing anything to do with dates and times. Static type checkers, like Mypy, get more attractive in proportion to codebase complexity and size. Which is fun and all, but testing for the sake of it doesn\u2019t necessarily stop bad things happening. It\u2019s very possible to write a test that covers the code you\u2019ve just written without ensuring that only the intended behaviours happen. \u21a9"},{"title":"Why Talk About\u00a0Jesus?","category":"Non-technical/Other","url":"faith-in-jesus.html","date":"5 January 2021","body":"Start with happiness It seems to me that I am much happier as a Christian than I would be if I weren\u2019t. By \u201cbeing a Christian\u201d I mean following Jesus - trying my best to act, think and speak like he wants me to, because I\u2019m grateful that he did what I believe he did. I believe he died for my sins and is now alive, having been miraculously resurrected by his father1 (who is now my father too). It\u2019s not just an intellectual exercise. I believe that Jesus is alive because I seem to have experienced his companionship and interventions in my day-to-day life. This is unusual, mind boggling, and makes things significantly more complicated than if I thought he were dead. Nonetheless, it seems to be true. It\u2019s not like I see supernatural interventions every day or anything, but there have been various times - too many to discard - when my prayers have been answered in practical ways (I\u2019ll leave out the intangible for now) that have surprised me and given me a lot of respect for the risen Jesus doing what it\u2019s said he\u2019d do in the bible. Obvious answers There was this time I was in Yemen trying to get back to my room across town, at night and during a storm. I was lost and couldn\u2019t read Arabic to understand which bus to get. So I prayed, and two buses later I got off at the right stop.
Another example: I was getting a haircut in Liverpool Street Station in London, and as I was sitting in the chair someone took my work bag. When I came to pay, I realised my laptop and wallet had gone. Trying to find a stolen bag in central London feels ridiculous. Even so, I spent a few hours wandering around the alleys and parks looking for it, and I resigned myself to some awkward conversations and using most of my next salary to replace the stolen computer. To my surprise, my parents-in-law prayed about it and were really confident it would come back to me, which seemed super unlikely. A few days later I received a call from an office worker in the Gherkin building - my bag was under his desk and he wanted to know if I\u2019d like to collect it. I guess I\u2019ll try not to make the same mistake twice. The other example I tend to remember happened not long after I\u2019d first moved to Vienna. I was feeling lonely and isolated and I was wondering how on earth I was going to find some sort of normal that was healthy and sustainable. Try as I might, I wasn\u2019t enjoying things at all and was feeling stressed and overwhelmed. I remember walking up this steep hill in the 18th district towards my office, and repeatedly praying this really simple prayer \u2018God, please help, please help, please help..\u2019. It was that simple because I couldn\u2019t think of anything more useful to say. Nothing dramatic happened that day, or even that week as far as I can remember. But when I look back, it was a turning point when things started getting better instead of worse. I suppose you can call this last example an intangible answer to prayer. Maybe it is, but I think anyone who\u2019s grappled with overwhelming loneliness or panic would say that the emotions become all too tangible at times. Being in a different emotional state seemed to make a tangible difference to just about every area of life. Eating, productivity at work, relationships with friends and colleagues, etc.
This is only the beginning of why I think Jesus is alive and why I think Christianity is a real and living faith. It\u2019s not primarily a tradition, a worldview or a set of rules and ideals. Christianity is a relationship with Jesus. He did lots of amazing things that have let me have a very practical relationship with him. It\u2019s an almost unbelievable premise from which to live a life. It has so many implications. And it holds up to scrutiny, and my experiences bear it out. Why write this? Despite Jesus\u2019 incredible works and their implications, modern Christianity seems to be in a really confused and ineffective state. Ideas and thoughts about Jesus have become mixed up with cultural christianity or christian politics and traditions. These are each different things, and unless we distinguish between them with the words we use, we are going to find it hard to think and communicate clearly. I suspect that we are in a negative cycle of imprecise thinking leading to imprecise articulation, which leads to further imprecise thinking. Unless we can talk and think about one thing at a time, atomically if necessary, we take on the additional risks of reaching the wrong conclusions personally, or arguing with others due to misunderstanding rather than actual disagreement. We should try to create a more precise vocabulary to navigate our Christian lives, so that we can think clearly about the experiences and questions we have. Imprecise thinking is frustrating, and conversations are less effective when there\u2019s an increased risk of disagreement, or of misunderstanding what someone else means. This makes it harder to talk about our faith, which makes talking less common, and this creates room for apathy, or missed opportunities, or sadness. And many other things also.
\u21a9from first principles \u21a9"},{"title":"API Design\u00a0Principles","category":"Technical/Developer Tools","url":"api-design.html","date":"4 January 2021","body":"Some super brief notes I made about API\u00a0design. Background It\u2019s more of an art than a\u00a0science. RESTful (Representational State Transfer) API design is an art. Alternative API architectures: SOAP (Simple Object Access Protocol) is a heavier\u00a0standard. GraphQL - doesn\u2019t overfetch. Graph query language made by\u00a0Facebook. APIs are everywhere (not just web APIs). They\u2019re an abstraction that hides an implementation. Django model managers are an API (and also part of Django\u2019s ORM), JavaScript is an API,\u00a0etc. RESTful\u00a0APIs Web APIs (all REST APIs?) expose a database to\u00a0clients. A REST API is a URL route (endpoint) that returns JSON or XML. POST, GET, PUT, PATCH, DELETE correspond to Create, Read, Update/Modify, Delete (HTTP methods correspond to CRUD\u00a0methods). HTTP METHODS: PUT (create or update) is idempotent. POST is not idempotent (keep on POSTing and you keep creating new resources). PATCH - partial\u00a0update. GET, HEAD, OPTIONS and TRACE methods are idempotent because they are only designed for retrieval, as is DELETE. HEAD - almost identical to GET, but without any body. Good for checking what a request would return, i.e. before downloading a large amount of\u00a0data. OPTIONS - returns data describing what other methods and operations the server supports at the given URL. More loosely defined than other\u00a0verbs. Use HTTP verbs to make requests. Use sensible resource names. Naming things is hard, so think about this a bit before starting. Use identifiers in your URLs, not the query string. Good: /users/123. Poor: putting the identifier in the query string. Use the hierarchical structure of the URL to imply the structure of the API. Design (names and structure of things) for the user/client, not for the database. Resource names should be nouns not\u00a0verbs. Use plurals consistently, not collection verbiage.
Good: customers/ Use camel case or snake case, consistently. Short is better than long, but be\u00a0clear. Spend time on design before writing\u00a0code. Use HTTP response codes to indicate status. Prefer JSON over XML. (Hotline does HTML..) XML requires schemas for validation and namespaces. Don\u2019t support this complexity at the beginning (or ever) unless required. If it is required, make the XML as similar to JSON as\u00a0possible. Put links in the HTTP link header, or use a JSON representation of\u00a0this. Use the HTTP location header to contain a link on resource creation, or for GET with pagination use first, last, next,\u00a0prev. Connectedness - return links in the response which link to useful resources. At minimum, a link to show how the data was received, or\u00a0posted. Idempotency - clients making the same repeated requests create the same result on the server side. I.e. making repeated requests has the same result as making a single request, server side. On the client side, a response code may change, of\u00a0course."},{"title":"Principles Of Object Orientated\u00a0Programming","category":"Technical/Developer Tools","url":"principles-of-oop.html","date":"4 January 2021","body":"I recently interviewed for a lead developer role at Lab Digital1 and thought it would be sensible to review some of the fundamental aspects of Object Orientated Programming (OOP). You might think that\u2019s an unusual way to prepare for an interview, and you\u2019d be right. Nothing close to these notes arose during the interview, but I find this stuff interesting. If I\u2019m motivated enough to study it, then I think that\u2019s a good enough reason by itself, without a specific reason. These are some brief notes. Object Orientated Programming has four key aspects: Encapsulation (hiding information), Abstraction (hiding the implementation), Inheritance, Polymorphism. 1. Encapsulation Each object keeps its state private, inside a class. Instance variables are kept private and accessor methods are made public. Other objects don\u2019t have direct access to this state.
They can only call a list of public functions (methods). The object manages its own state via methods; no other class can touch it unless explicitly (not by default) allowed. Private variables. Public methods. You can define classes within classes, and functions within functions. 2. Abstraction A natural extension of encapsulation. A concept or idea that is not associated with any particular instance. Expresses the intent of the class, rather than a specific implementation. Programs are often extremely large and separate objects communicate with each other a lot. This makes maintaining large programs difficult, and abstraction tries to solve this. Applying abstraction means that each object should only expose a high-level mechanism for using it. This mechanism should hide internal implementation details. It should only reveal operations relevant for the other objects. This mechanism should be easy to use and should rarely change over time. Implementation changes \u2014 for example, a software update \u2014 rarely affect the abstraction you use. e.g. a coffee machine. It does a lot of stuff and makes quirky noises under the hood. But all you have to do is put in coffee and press a button. 3. Inheritance In OOP, objects are often similar, sharing similar logic. But they are not 100% the same. Create a (child) class by deriving from another (parent) class. This way, we form a hierarchy. The child class reuses all fields and methods of the parent class (the common part) and can implement its own unique part using method or attribute overriding. 4. Polymorphism Gives a way to use a class exactly like its parent so there\u2019s no confusion with mixing types. But each child class keeps its own methods as they are. This typically happens by defining a (parent) interface to be reused. It outlines a bunch of common methods. Then, each child class implements its own version of these methods.
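The inheritance and polymorphism points above can be shown in one short Python sketch. The `Shape`/`Square`/`Circle` names are my own illustrative choices, not from the original notes:

```python
class Shape:
    """Parent class outlining a common method (the shared interface)."""
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):                  # child overrides the common method
        return self.side ** 2

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):                  # each child keeps its own implementation
        return 3.14159 * self.radius ** 2

# Polymorphism: a list expecting Shapes evaluates the right implementation
# of the common method, regardless of which child is passed.
shapes = [Square(2), Circle(1)]
areas = [s.area() for s in shapes]
print(areas)  # [4, 3.14159]
```

The loop never checks which child it holds; the language dispatches to the right `area`, which is the polymorphism described above.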
Any time a collection (such as a list) or a method expects an instance of the parent (where common methods are outlined), the language takes care of evaluating the right implementation of the common method \u2014 regardless of which child is passed. I\u2019d like to be so familiar with the following features that I can use them without referring to notes: Getters and setters. Instance methods compared to class methods. Inheritance, mixins, and decorators. The \u201cmagic\u201d within the Django source code that requires mypy to use extensions in order to do its static type checking correctly. Unfortunately I didn\u2019t get the job. They wanted a senior Python developer with experience with Infrastructure As Code, and also working at an agency. Can\u2019t win them all. \u21a9"},{"title":"Optimizing The Performance Of This\u00a0Blog","category":"Technical/Web","url":"site-performance.html","date":"4 January 2021","body":"I\u2019m coming to the end of redesigning this site. Now that the main changes have been made it\u2019s fun (and good practice) to optimize the site so that it loads quickly and is optimized for\u00a0SEO. Lighthouse is a utility built into Chrome that runs a technical audit on a webpage and assesses a wide range of features. It also provides details about how to improve the\u00a0page. My site is hosted on Github Pages and is accessed via Cloudflare, which gives me a lot of performance gains including minified HTML and CSS, caching, and super fast servers. I\u2019m using Github Pages and Cloudflare for free and I think it\u2019s amazing that I can get the benefits of these services without needing to pay anything. If someone knows where to look and can teach themselves using free resources, they can publish a site that can be read by anyone anywhere in the world.
It\u2019s\u00a0amazing. Below are the lighthouse results for the blog\u2019s index page and for a recent\u00a0post."},{"title":"Unix: Utilities To Analyse And Update Multiple Text\u00a0Files","category":"Technical/Developer Tools","url":"using-unix-utilities-to-analyse-and-update-multiple-files.html","date":"4 January 2021","body":"As part of the redesign of this blog I wanted to make an article\u2019s category more meaningful. Previously I simply picked a handful of categories and then assigned a single category to each post. This method becomes limiting when an article is relevant to several categories. Also, using nested categories seems like a good way of grouping similar content and allowing more nuanced filtering of\u00a0interests. As I considered how to update the categories of existing articles, I realised this would be a good opportunity to practice analyzing and updating text files using Unix utilities. Here is how I reviewed and updated the categories of my articles. I use Pelican to generate the static files for this site. It converts markdown into HTML. Metadata for each article is set at the beginning of a file; the title is set by typing Title: ... and similarly the category is set by typing Category: ... on its own\u00a0line. To locate, analyse and update my existing categories I would therefore need to find all the markdown files which have a row that begins with Category: grep -h 'Category:' **/*.md - prints each search\u00a0result. grep -h 'Category:' **/*.md | sort - prints and sorts each search\u00a0result. grep -h 'Category:' **/*.md | sort | uniq -c - prints and sorts each search result, then counts how many occurrences of each unique result there\u00a0are.
I had some repeat results though because some rows had white space at the end, so in order to make these the same, I needed to remove the trailing whitespace: grep -h 'Category:' **/*.md | sed 's/ *$//' | sort | uniq -c This gave me counts with the Category: prefix still attached (e.g. 2 Category: Data, 8 Category: Tools). The Category: prefix is repeated and isn\u2019t\u00a0needed: grep -h 'Category:' **/*.md | sed 's/ *$//' | sort | uniq -c | sort | sed 's/Category: //' This gives me the following output, which is\u00a0acceptable: 2 Data 2 Engineering 2 Front-end 6 \u2026 8 Tools 15 General 15 Startups 16 \u2026 New\u00a0Categories The next stage was to begin updating these categories with the new, nested categories. I\u2019ve decided to try splitting the categories into technical and non-technical. I can imagine splitting Technical > Data even more in future, perhaps having Data Analytics, Data Science, and Data Engineering as separate categories. Technical: Data, Web, Cryptocurrency. Not technical: Family, Self, Career. I cd into the directory containing the markdown files, and then to change all the articles with Category: Tools to Category: Technical/Tools I\u00a0did: grep -l 'Category: Tools' *.md | xargs sed -i 's/Category: Tools/Category: Technical\/Tools/g' If I want to see a list of files containing Category: General: grep -H 'Category: General' *.md If I want to see just the file names,\u00a0then: grep -l 'Category: General' *.md Update Since writing this post I\u2019ve modified the categories a few times. The commands I run to switch out categories are as\u00a0follows: export oldName=... export newName=... grep -l "Category: .*$oldName" *.md | xargs sed -i '' "s/$oldName/$newName/g" Notes: Double quotes are not the same as single quotes. You need to use them if you want to access variables or commands inside a\u00a0string. .* is a wildcard operator allowing any number of characters. It\u2019s required when an article belongs to a nested\u00a0category.
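As a cross-check of the shell pipeline above, the same strip-count-sort steps can be reproduced in a few lines of Python. This is a sketch of my own, assuming the markdown files sit below the current directory; `count_categories` is a hypothetical helper name:

```python
# Equivalent of: grep -h 'Category:' **/*.md | sed 's/ *$//' | sort | uniq -c
from collections import Counter
from pathlib import Path

def count_categories(lines):
    """Count the value of every 'Category:' line, ignoring trailing whitespace."""
    counts = Counter()
    for line in lines:
        if line.startswith("Category:"):
            counts[line[len("Category:"):].strip()] += 1
    return counts

# Usage over all markdown files below the current directory:
totals = Counter()
for md in Path(".").rglob("*.md"):
    totals += count_categories(md.read_text().splitlines())
for category, n in sorted(totals.items()):
    print(n, category)
```

`strip()` plays the role of the `sed 's/ *$//'` step, so "Tools" and "Tools " collapse into one count, just as in the shell version.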
Out with the old, and in with well-written HTML, an improved CSS framework, maintainable code, dark mode, and articles with\u2026 This website was my first ever project using HTML and CSS, and the codebase for the original blog was terrible. It was poorly written and hard to maintain. I remember when I was first building it and trying to figure out what a <div> or a <span> really\u00a0was. At times I felt like little more than a monkey randomly bashing keys, hitting save and refreshing the browser tab. I felt guilty for spending any non-essential time away from my wife and daughter. I wondered if any benefits would actually materialize that would outweigh the costs of not rushing home to take care of a new-born and relieve a tired and stressed\u00a0mum. It took a while, but eventually this blog became the most effective force multiplier I\u2019ve ever\u00a0used. As I\u2019ve learnt more about web development, the JAM stack has become increasingly intuitive and familiar. A side effect was that as I became comfortable with \u201cgood\u201d dev work, working with this blog\u2019s old code base became increasingly uncomfortable. I wanted to update the blog so that it would be easy and fun to use again. I want to be able to play with it quickly. I wasn\u2019t aiming for a radical re-design; I like that the focus is on text and I\u2019m not exploring any on-trend design choices. I think my original design choices have held up well. I want a design that will work for many years, with templates and code that is easy and intuitive to read, and design elements that are easier to work\u00a0with. I hope that I\u2019ll be writing here more regularly over the next few months. It\u2019s been a busy year and there is lots to write\u00a0about."},{"title":"Product-Led\u00a0Growth","category":"Non-technical/Entrepreneurship","url":"product-led-growth.html","date":"8 December 2020","tags":"marketing, startups ","body":"Part 1: Design your\u00a0strategy Chapter 1: Why is product-led growth on the\u00a0rise? Product-Led Growth is a go-to-market (GTM) strategy that relies on using your product as the main vehicle to acquire, activate and retain\u00a0customers. Product-Led Growth (PLG) means that every team influences the product: Marketing - how can our product generate a pipeline of\u00a0leads? Sales - how can we use the product to qualify our prospects for\u00a0us?
Customer Success - how can we create a product that helps customers become successful beyond our\u00a0dreams? By having every team focussed on the product, you create a culture that is built around enduring customer\u00a0value. By leading with the product throughout the org, PL companies get: shorter sales\u00a0cycles, lower Customer Acquisition Costs\u00a0(CAC), higher Revenue Per Employee (RPE). A GTM strategy is an action plan that specifies how a company will reach target customers and achieve a competitive\u00a0advantage. In order to select a GTM, you first need to understand your ideal\u00a0customer. Knowing these elements will help you choose the correct GTM that will acquire, retain and grow your customer base in the most efficient\u00a0way. Sales-Led Profit Centers: Sales, Marketing. Cost Centers:\u00a0\u2026 Advantages: Annual Contract Value (ACV) can be very\u00a0high. Enterprise first solutions that are very complex and therefore need a high-touch sales\u00a0model. If the Total Addressable Market (TAM) is very small, because your market is super niche, you can quite easily talk to almost all market participants (PLG is built for large\u00a0TAMs). It\u2019s great for new categories of product where education is required, because you need to change how people approach a problem. This takes time and money. This in turn requires that you understand your customers\u2019 pain points, objections and core problems. If you jump too quickly to a PLG GTM strategy then you risk a high churn rate because you haven\u2019t understood or educated your customers well\u00a0enough. Disadvantages: Sales cycles are very\u00a0long. The Life Time Value (LTV) must be high enough to recoup the investment in CAC. This often requires charging the customer a premium. This premium price isn\u2019t because the product is amazing, or more valuable to the customer, but because the customer acquisition model is expensive. If you use a Sales-Led GTM, you need to watch out for competitors who can sell more efficiently or have a more efficient Customer Acquisition Model.
They can steal your market share by offering the same product at a lower\u00a0price. Customer Acquisition methods are super leaky. Most leads (MQLs) never result in a closed deal; this is partly because: it encourages marketers to gate content in order to hit their MQL\u00a0goals, it focusses on content consumption as a leading indicator of intent (but reading a white-paper or brochure doesn\u2019t mean I\u2019m going to buy the\u00a0thing), the entire process rewards creating friction in the buying\u00a0process. Consequently there is often a disconnect between marketing and\u00a0sales. Product-Led Switching from a Sales-Led GTM to a Product-Led GTM creates a defensive\u00a0moat. A product led marketing team asks \u201cHow can we use the product to qualify our prospects for\u00a0us?\u201d A product led customer success team asks \u201cHow can we create a product that helps customers be successful without our\u00a0help?\u201d A product-led engineering team asks \u201cHow can we create a product with a quick time-to-value?\u201d Growth is much faster, because: you need fewer resources to\u00a0scale, the top of the customer funnel is much\u00a0wider, CAC is much\u00a0lower, higher RPE (revenue per\u00a0employee), user experience is\u00a0better. Chapter 2 - Free, Freemium or\u00a0Demo Use the MOAT framework to pick the right GTM\u00a0strategy: Market Strategy - Dominant? Ocean Conditions - Blue or\u00a0Red? Audience - Do you have a top-down or bottom-up selling\u00a0strategy? Time to Value - How much time do you need for a customer to experience the\u00a0value? Dominant - Do it better than the competition and charge a lower\u00a0price. Is your TAM big\u00a0enough? Does your product solve a specific job significantly better and at lower cost than anyone else on the\u00a0market? Can the user realize significant ongoing value quickly with little or no\u00a0help?
Do you want to be the undisputed market\u00a0leader? Differentiated - Pick and win a fight against an industry giant. Your main line of defense against the giant is to focus on a\u00a0niche. Do a specific job better than the competition but charge\u00a0more. This is not a volume\u00a0game. Free trials and demos work well\u00a0here. Because it\u2019s specialized, combining freemium with quick time-to-value is\u00a0difficult. Your main competitive advantage is how you solve your customers\u2019\u00a0problem. Does your market have an industry\u00a0giant? Is the TAM big\u00a0enough? Is the ACV high\u00a0enough? Could prospects experience a Magic Moment during a free\u00a0trial? Disruptive - Charge less for an inferior product (e.g. Canva, Google docs). Build a simpler product that solves a specific pain point, and because it\u2019s simpler, it\u2019s faster, you can charge less, and it\u2019s more appropriate for some over-served or under-served\u00a0customers. It\u2019s a volume\u00a0game. Costs must be\u00a0low. Product must be easy to\u00a0use. Is TAM large\u00a0enough? Can on-boarding be\u00a0automated? Freemium models\u00a0thrive. Chapter 3 - Red-Ocean or\u00a0Blue-Ocean? Red-ocean companies try to outperform their rivals in order to grab a larger share of the existing market. As the market gets crowded, opportunities for profit and growth reduce. Products become commodities. Cut-throat competition turns the waters\u00a0red. Blue-ocean companies access untapped market space and create demand. They have opportunities for highly profitable growth. Competition is irrelevant. Create and capture new\u00a0demand. Some markets will be red-ocean, but a particular niche within it will be blue\u00a0ocean. Blue Oceans require educating the customer to create the demand. This is high-touch and often a PL GTM strategy isn\u2019t going to work. It needs to be sales-led in order to educate the customer enough. But if Time-To-Value (TTV) is short then PLG could be\u00a0great. Red Oceans need big wide funnels in order to compete, and PL GTM strategies work great.
They\u2019re defensible, keep costs low and make sales cycles\u00a0short. Chapter 4 - Top-Down or Bottom-Up PLG works for\u00a0bottom-up. High touch sales-led strategies work for Top-Down enterprise sales strategies where the product is super complex and the sales cycle is very\u00a0long. Top-Down (the sales team is the distribution\u00a0channel): High ACV. Lower customer\u00a0churn. Poor\u00a0scalability. High CAC. Long sales\u00a0cycles. Free Trial. Bottom-Up (the product is the distribution\u00a0channel): Low CAC. Predictable sales\u00a0figures. Scalable\u00a0funnel. Small contracts, upfront investment as non-paying customers use the product for\u00a0free. Freemium or Free-Trial. Chapter 5 How much time until you deliver on your promise to\u00a0prospects? How much time until the product sells\u00a0itself? PL GTM strategies require a short\u00a0time-to-value. Rank your users across 2 dimensions and group into 4 quadrants. The dimensions are Ability (low - high) and Motivation (low - high): Low Motivation - Low Ability =\u00a0\u2026 High Motivation - Low Ability =\u00a0Rookie. Low Motivation - High Ability =\u00a0Veteran. High Motivation - High Ability =\u00a0Spoiled. Figure out your top two quadrants. Unless all your users are Spoiled, you need to reduce\u00a0friction. Questions: How motivated are your\u00a0users? Is your product easy for your target audience to\u00a0use? Can users experience the core value (magic moment?) quickly? Part 2 - Build Chapter 7 - Build a positive feedback\u00a0loop: Understand your\u00a0value. Communicate your\u00a0value. Deliver on your\u00a0promise. Repeat. Chapter 8 - Understand your\u00a0value If you\u2019re selling live-chat software, you\u2019re not really selling live-chat software, you are selling a new and better way to talk to your\u00a0customers. You are selling the outcome, the result, the\u00a0why. Pain makes us want to change so that we can avoid or prevent the\u00a0pain. Pain is\u00a0\u2026 There are three reasons why people buy a\u00a0product: Functional Outcome - the core task that needs to get\u00a0done.
Emotional Outcome - how a customer wants to feel or avoid feeling as a result of the\u00a0product. Social Outcome - how a customer wants to be perceived by others by using your\u00a0product. In every software, there are usage patterns that point towards the core outcomes that are most important to\u00a0customers. One of the biggest differences between sales-led and product-led companies is that PL companies monitor usage patterns to see if users are accomplishing meaningful outcomes. These outcomes are referred to as value-metrics. A value-metric is the way you measure value exchange in your product. They are the linchpin to successful execution of a product-led GTM strategy because you are aligning your revenue model directly with your customer acquisition model. Your value metrics play a vital role in how you price your product, set up your monitoring and build your\u00a0team. Value metrics could\u00a0be: for Vimeo, number of videos uploaded by the\u00a0user, for Slack, number of messages sent by the\u00a0user, for PayPal, amount of revenue generated by the\u00a0user. There are functional and outcome based value metrics. Functional value metrics are \u201cper user\u201d or \u201cper 100 videos\u201d. Pricing scales around functions of usage. Outcome based value metrics charge based on the outcome, e.g. how many views a video received, or how much money a customer made when using your payment\u00a0platform. Many SaaS companies rely on features as a way to justify higher prices, but this produces higher\u00a0churn. Value metrics outperform feature-based pricing with up to 75% less\u00a0churn. Outcome based value metrics reduce churn by an additional\u00a0amount. A good value metric is easy for a customer to understand. They need to immediately understand what they\u2019re paying for and where they fit in your structure. If you\u2019re in an established market, it makes sense to look at what the competition\u00a0does. A good value metric is aligned with the value that the customer receives through the product. Consider the low-level components of your high-level outcome.
E.g. what low level actions are necessary to get the end result? Sending lots of messages? Meeting lots of people? Finding lots of\u00a0things? A good value metric grows with your customer\u2019s usage of the valuable outcome. If customers get incredible value from the product, charge them more because the product is worth it. Also, if they aren\u2019t getting much value from the product, charge them\u00a0less. Don\u2019t do user based pricing if you want to get lots of users - it\u2019s a conflict of\u00a0interest. If you\u2019re small, you can try different pricing strategies and iterate to\u00a0success. Ask\u00a0yourself: What do my best customers\u00a0do? What do my best customers not\u00a0do? What features did my best customers try first when they signed\u00a0up? What are the similarities among my best customers that led to\u00a0success? For churned customers, what were the main differences between them and the best customers? Were they in the ideal audience? Why did they churn? What did they do or not do that good customers\u00a0did?"},{"title":"The Mom\u00a0Test","category":"Non-technical/Entrepreneurship","url":"mom-test.html","date":"4 December 2020","tags":"book ","body":"Table of Contents Summary Update\u00a02 Update\u00a01 Chapter 1 - Opinions are\u00a0worthless Chapter 2 - Avoid bad\u00a0data Chapter 3 - Ask the important\u00a0questions Chapter 4 - Keep it\u00a0casual Chapter 5 - The currencies of\u00a0commitment Chapter 6 - Finding\u00a0conversations Chapter 7 - Choosing your\u00a0customers Chapter 8 - Getting the most from the conversation by prepping and\u00a0reviewing Summary Ask good\u00a0questions. Avoid bad\u00a0data. Keep it\u00a0casual. Push for\u00a0commitment. Frame the meeting\u00a0well. Focus on the right, tight, customer\u00a0segment. Prep well, take good notes, review your\u00a0notes. Update\u00a02 Doing Mom-test and customer development remotely: Update\u00a01 This comment on Hacker News mentions a similar book \u201cThe Right It\u201d by Alberto Savoia. He mentions 8 ways to test an\u00a0idea: The Pretend-to-Own \u2013 Before investing in buying whatever you need for your it, rent or borrow it\u00a0first.
The Pinocchio \u2013 Build a \u201clifeless\u201d version of the\u00a0product. The Fake Door \u2013 Create a fake \u201centry\u201d for a product that doesn\u2019t yet exist in any\u00a0form. The Minimum Viable Product (or Stripped Tease) \u2013 Create a functional version of it, but stripped down to its most essential\u00a0features. The Mechanical Turk \u2013 Replace complex and expensive computers or machines with human\u00a0beings. The Provincial \u2013 Before launching world-wide, run a test on a very small\u00a0sample. The Re-label \u2013 Put a different label on an existing product that looks like the product you want to\u00a0create. Rob Fitzpatrick responds to this comment with more thoughts and links to this comment. Chapter 1 - Opinions are\u00a0worthless Anything involving the future is an overly optimistic lie. You want objective facts about what happened in the\u00a0past. Find out if people care about your idea by never mentioning\u00a0it. Forcing yourself not to mention your idea will force you to ask better\u00a0questions. Make sure your questions pass the mom test: Talk about their lives, not your\u00a0idea. Ask about specific objective events in the past, not opinions about the\u00a0future. Spend most of your time listening, not\u00a0talking. Chapter 2 - Avoid bad\u00a0data You aren\u2019t allowed to tell them what their problem\u00a0is. They aren\u2019t allowed to tell you what to\u00a0build. Bad data\u00a0includes: Compliments. Fluff. Hypotheticals. Ideas. Chapter 3 - Ask the important\u00a0questions You should be terrified of at least one of the questions you ask in every\u00a0conversation. Search out the scary questions you\u2019ve been avoiding. What\u2019s the worst thing a prospect could say? What\u2019s the scariest question you could\u00a0ask? A good way to find scary questions is to imagine that your company has failed and then to ask\u00a0why. If you get an unexpected answer to one of your questions and it doesn\u2019t have any effect on what you\u2019re going to do, then was it really worth asking\u00a0it? General advice for hard things includes asking hard questions.
Imagine you were delegating the task, what would you tell the person to do? That\u2019s\u00a0your answer. You can ask about\u00a0money. Love bad news - if you find out that your idea is fundamentally flawed you\u2019ve just saved a tonne of time and energy and money. Move on. It\u2019s getting you closer to the\u00a0truth. Bad news isn\u2019t the result of an opinion. No one knows if your idea will work, only the market\u00a0knows. Opinions don\u2019t\u00a0count. A lukewarm response to your idea can be a great conversation because you realise that your idea isn\u2019t a great idea. Lukewarm means they don\u2019t care enough to buy the first (worst) version before it\u2019s\u00a0ready. Look before you zoom. Don\u2019t focus on details too soon, you need to understand the big picture\u00a0first. You have product risk and customer risk: Product risk - can I build it and grow\u00a0it? Customer risk - is there a big enough group of people who are going to buy\u00a0it? Pre-plan a list of the 3 most important questions (including a scary one) before every meeting or conversation. Be ready with these. They should change from week to week. As you get good quality answers to existing questions you can bring in new\u00a0questions. Chapter 4 - Keep it\u00a0casual Problem -> Solution ->\u00a0Sales Normally you would have 3 meetings with a client when you make a big sale. This lets you do each stage really well without blurring the data: Identify their\u00a0problem. Explain your\u00a0solution. Sell the\u00a0solution. Identifying a problem doesn\u2019t have to be a meeting, keep it casual and you will get more honest feedback much faster. It works better as a chat when people are relaxed and saying what they really\u00a0think. It takes 5 minutes (max) to identify if a problem exists and is\u00a0important. Give as little info as possible about your idea whilst still nudging the conversation\u00a0along. Chapter 5 - The currencies of\u00a0commitment In early stage sales, the main goal is learning.
Money is a\u00a0side-effect. Will they spend money, time, or reputation on your\u00a0solution? If someone is willing to risk reputation or spend time or money on your idea, then you can believe what they\u2019re\u00a0saying. Hearing a compliment means they\u2019re trying to get rid of\u00a0you. If it\u2019s a bad meeting or you\u2019re not sure what they really think, push for a commitment of some kind. Ask them to spend time, reputation or money and you\u2019ll see what they really\u00a0think. If they aren\u2019t excited (not just interested but in pain and excited for your solution) then you need to find that out ASAP. It\u2019s still a good meeting if you can discover\u00a0that. Are you offering pain\u00a0relief? A lead isn\u2019t a real lead until you\u2019ve given them a concrete chance to reject\u00a0you. Ask learning questions by using the Mom Test, then confirm by selling\u00a0it. You need crazy customers: They have a painful and expensive\u00a0problem. They know they have a painful and expensive\u00a0problem. They have the money to pay you to solve\u00a0it. They already have their own bad solution to this terrible problem, and yours is clearly\u00a0better. A crazy customer doesn\u2019t say \u201cyeah that\u2019s great, I\u2019m really interested let me know when it\u2019s ready\u201d. They say \u201cAHHH THIS IS THE WORST PART OF MY LIFE AND I WILL PAY YOU RIGHT NOW TO FIX IT!\u201d A crazy customer will front you the money when all you have is a barely functional prototype made of\u00a0duct-tape. A crazy customer is the person reading your blog, searching for workarounds, when you haven\u2019t spent loads on marketing and\u00a0advertising. Keep a crazy customer close - they\u2019ll stick with you when times are\u00a0tough. Chapter 6 - Finding\u00a0conversations Vision -> Framing -> Weakness -> Pedestal ->\u00a0Ask Keep having conversations until you stop hearing new\u00a0stuff. If it\u2019s a topic you both care about, find an excuse. You\u2019ll both enjoy the chat.
You don\u2019t need to mention your idea if it breaks the\u00a0premise. Warm introductions are the ideal way to start a new conversation. Six degrees of separation in the world, so find someone who knows someone who knows\u00a0them. Cold calls -\u00a0\u2026 Serendipity - be prepared, be\u00a0bold. Have a good excuse -\u00a0hustle. Landing pages - so that googling the problems brings them to\u00a0you. Organise an event - bring the businesses together for an event. You\u2019ll be considered an expert because you\u2019re the\u00a0organiser. Become a subject matter\u00a0expert. Speaking and teaching engagements - you get to have strong opinions, and you\u2019ll be\u00a0respected. Chapter 7 - Choosing your\u00a0customers Startups don\u2019t starve, they drown. In options, choices, ideas. Choose a good customer segment, focus on it, and don\u2019t get\u00a0distracted. In the beginning: Google - PhD\u00a0students. PayPal -\u00a0eBay. Evernote - Moms with\u00a0recipes. It will look obvious with hindsight, probably not so obvious before you\u2019ve figured it\u00a0out. If you can\u2019t get a consistent answer to a question, maybe you\u2019re speaking to more than one customer\u00a0segment. If you aren\u2019t finding consistent problems and goals then you don\u2019t have a specific enough customer segment. Within the group, which type of person would want it the\u00a0most? Would everyone within the group want to buy/use it? Or only\u00a0some? Why does the subset want\u00a0it? Does everyone in the group have the same\u00a0motivation? What other motivations are\u00a0there? Who else (outside the group) has this\u00a0motivation? Go for Who-Where pairs of segments. \u201cFinance professionals who\u2026\u201d You want customers who are reachable, profitable and personally\u00a0rewarding. Good segments are usually \u201cwho-where\u201d pairs. If you don\u2019t know where to find your customers, keep slicing until you do.
Make sure the segment is reachable, profitable and personally\u00a0rewarding. Chapter 8 - Getting the most from the conversation by prepping and\u00a0reviewing Prepare\u00a0well: Have your three big questions ready, including the scary\u00a0ones. Know who you are speaking\u00a0to. Know what commitment and next-steps you want to push\u00a0for. Spend up to an hour writing\u00a0down: Your best guesses about what you think they\u2019ll\u00a0say. What they care\u00a0about. What they\u00a0want. If you have a focussed segment, you\u2019ll only need to do this\u00a0once. If you come across a question which can be answered using the internet, use the\u00a0internet. Take good\u00a0notes: Ask if you can record the\u00a0audio. Record emotions as well as words. Verbatim alone isn\u2019t always accurate 6 months\u00a0later. Use shorthand for follow\u00a0up. Observe and record emotions - happy, angry, meh,\u00a0etc. Pains and obstacles are a lot more important if someone is embarrassed or angry when they\u2019re talking about\u00a0them. Dig into the big emotions, find out what\u2019s causing them, or why it\u2019s a big\u00a0deal. Review your notes: Meta - which questions went well or\u00a0not? What were the answers to the big\u00a0questions? How can you do better next\u00a0time? What were the clear signals? What signals did you\u00a0miss?"},{"title":"Obviously\u00a0Awesome","category":"Non-technical/Entrepreneurship","url":"obviously-awesome.html","date":"3 December 2020","tags":"marketing, positioning, startups, book ","body":"Intro In order to do better marketing, understand your value and make it obvious. Understand what makes a product stand\u00a0out. Positioning is a fundamental input to every business tactic you\u00a0use. If you fail at positioning, you fail at marketing and\u00a0sales. Positioning is \u201ccontext setting\u201d for\u00a0products. Customers need to be able to easily understand: What your product\u00a0does. Why it is\u00a0special. Why it matters to\u00a0them. If your prospects can\u2019t figure out what your product does quickly, they will invent a position for you.
It might hide your key strengths or misunderstand your\u00a0value. Find people to demo your product to, then ask them to describe it back to you. Do the same with existing customers. If they\u2019re not saying the same thing then you have a positioning\u00a0problem. Products with strong positioning: Find the best kind of\u00a0customers. Make the value\u00a0obvious. Sell\u00a0quickly. Part\u00a01 Positioning as\u00a0context Without positioning to guide us, we don\u2019t know how to understand a\u00a0product. Positioning lets us make assumptions about: who a product is\u00a0for, what its main features\u00a0are, how much it\u00a0costs. Without it we would be paralyzed by choice, and we wouldn\u2019t be able to make sense of all the products around\u00a0us. The context and purpose of a product from the maker\u2019s perspective are often different to those of the\u00a0prospect. Products can be positioned in multiple ways, and often the best positioning is not the most obvious\u00a0one. Bad positioning makes it harder for prospects to figure out if your product is worth\u00a0buying. Positioning requires considering: The customer\u2019s point of view - what problem/pain does it\u00a0solve? The ways your product is\u00a0different. The strengths of a\u00a0product. The best market context to make your value\u00a0obvious. Five (plus one) components of\u00a0positioning: Competitive Alternatives - if you didn\u2019t exist, what would your customers use instead? It\u2019s what your prospects would use to do the task if your product didn\u2019t exist. It could be excel, or pen and paper. It could be nothing (in which case maybe your product doesn\u2019t solve a real pain\u00a0point). You probably know a lot more about your market, and problem, and alternative solutions, than your prospects do. (\u201cDoing nothing\u201d could be an opportunity, not a red\u00a0flag.) It\u2019s important to understand what customers will compare your product with, because this is how they will define \u201cbetter\u201d. Is it \u201ceasier\u201d than a pen and\u00a0paper? Customers might never have purchased a solution like yours\u00a0before. Unique Attributes - What do you have that alternatives do not?
Could be: Delivery model (online vs offline, installed on-site vs not). Business model (rental vs purchase). Specific expertise (data scientist with financial and back-end web dev expertise). Value (and quantifiable, objective proof) - What value do the attributes enable for customers? If unique attributes are your secret sauce, the value is why someone might care about your secret sauce. Fact-based, provable, demonstrable, quantifiable, objective. Third party opinions are relevant. Target Market - Who cares a lot about that value? Focus on customers who are most likely to purchase quickly, won’t ask for discounts, and will tell their friends about you. You should clearly identify who these are, and identify what sets them apart from other groups of customers. Why are they uniquely likely to buy from you when others wouldn’t, or would take longer to consider and make a purchase? Market Category - What context makes your value obvious to the ideal customer? Declaring that your product exists in a certain market triggers a powerful set of assumptions - make these assumptions help you, not hinder you. When presented with your product, customers will try to use what they already know to figure out what your product is all about and why it’s special. You want them to be able to do this really quickly and easily. Make these assumptions work for you, not against you. You won’t have to list every feature, because most of them will be assumed by the context of the market category. Get it right, and sales efforts (copy) won’t be wasted battling those assumptions but can instead build off of them to show secret sauce and value. (Bonus) Relevant Trends - what trends make your product relevant right now? Used carefully, trends can show prospects why they should pay attention to your product right now.
It can increase urgency and excitement, but the trend must be directly relevant and practically connected to your product. Blockchain, AI and ML are trends - they are only relevant to some products. Each of these components is relevant to the others."},{"title":"How I learnt to code","category":"Non-technical/Learning","url":"How-I-learnt-to-code.html","date":"1 December 2020","tags":"learning, code ","body":"4 years ago I started learning how to code, and it was difficult! It still is difficult, but I now have a collection of tools and perspectives that make it less daunting. Leveling up requires one more abstraction to wrap my head around, or one more API to understand, but I know I can do it. But I don’t think it needed to be so difficult, so now I’m building Code School Meta to make it easier to learn how to code. Learning to code has been fun, ultimately successful and life changing (hello job security!) but at the beginning it was sooo slow, and super tricky. Imposter syndrome is real, and I definitely felt it. I hadn’t taken any classes in computer science, and I felt like I knew almost nothing. The process began with Pandas, a (the) Python analytics library. Spreadsheets were slowing me down at work, and I was bored. I found this great tutorial by Brandon Rhodes. From there I found out about Jupyter Notebooks, and then I found out about GitHub Pages and how to make a blog using Pelican. That led to HTML and CSS (and also JavaScript, which I tried very hard to avoid for as long as possible). I felt like a monkey bashing a keyboard as I tried to make HTML elements do what I wanted. Unusually, I think Git was next. Mainly because I found this amazing tutorial to help me learn. It made Git seem OK, and also helped HTML and CSS make more sense too. Bonus! I suddenly realized that great learning materials are crucial if I was going to keep momentum and keep on enjoying the thrill of seeing the computer do something I hadn’t made it do before.
So now I\u2019m now working on Code School: Meta. It\u2019s an online community to make it easier to teach yourself how to code - less confusion about how to get started or what to learn next, more encouragem and lots of high If you\u2019d like to know more, please check it out and sign up for\u00a0update"},{"title":"","category":"snippet","url":"emacs-dreaming.html","date":"27 November 2020","tags":"emacs, dream ","body":"Two nights ago, I dreamt I was experiment with\u00a0Emacs"},{"title":"The 1-Page Marketing\u00a0Plan","category":"Non-technical/Entrepreneurship","url":"1-page-marketing-plan.html","date":"19 November 2020","tags":"marketing, book ","body":"Introducti Marketing has three\u00a0phas Before (prospect) - get people to know you and During (lead) - get the to like you and buy from you for the first\u00a0time After (customer) - get them to trust you, buy from you regularly, and\u00a0refer new\u00a0custom Marketing is strategy, all the other things are tactics Jargon free definition of\u00a0marketi Advertisin - The circus is coming to town, you paint a sign saying \u201cCircus coming to town on\u00a0Saturda Promotion - Put the sign on the back of an elephant and walk it into\u00a0town Publicity - Walk the elephant through the mayors flower beds and let a newspaper write about\u00a0it Public relations - Get the mayor to laugh about\u00a0it Sales - the town people go to the circus, you explain how much fun the entertainm is, they buy some tickets, you answer their questions, they spend lots of money on food, games,\u00a0sho Marketing is the plan that made the whole thing\u00a0happ"},{"title":"Learning to\u00a0market","category":"Non-technical/Entrepreneurship","url":"marketing-101.html","date":"19 November 2020","tags":"marketing, growth ","body":"Since April I\u2019ve been able to work full time as a solo founder. 
I\u2019ve challenged myself to build something useful enough that customers would want to pay for it and in the process of seeking that goal I\u2019ve become a much better and more I\u2019ve been working solo for about 7 months now, and in that time I\u2019ve begun using test driven developmen I\u2019ve built non-trivia data driven web apps using Django, and I\u2019ve learnt how to deploy and monitor those apps in production and make I\u2019ve made two main apps; moneybar.n and pippip.ema and I\u2019ve learnt so\u00a0much. I\u2019m realising though, that I still have much to learn in other spheres. Being a great developer is deeply meaningful to me. It\u2019s literally a bucket list item for me and I intend to be writing code as long as I live. But there is no point creating products if no one knows they\u00a0exist This is where marketing and positionin comes in. Right now it feels like I know nothing about how to get users, or validate an idea, or position a product. These are all super necessary and super\u00a0unkn On a more meta level, I\u2019m confronted with the lost benefits of working with co-founder or of having friends doing similar things. I want to work faster and make progress more efficientl I need to be part of a\u00a0communit Working in isolation does have its advantages though. I\u2019m self-taugh and self-direc figuring out the contours of uncharted territory and creating my own personal map. In my mind, I have a deep and almost personal relationsh with the coding abstractio and tools I\u2019ve learnt to work with. Classes and functions, strings and floats, literally have (to me) their own textures, colors and weights when I think or dream about\u00a0them It feels like I can pick up these abstractio as if they were physical objects and turn them around to examine them. 
Place them next to each other and compare the differences. Run thought experiments. In my experience this kind of relationship and affection simply doesn’t happen when taking a class or following someone else’s schedule. It’s satisfying to feel ownership of a skill like this, and it’s one of the primary reasons I consider coding to be similar to a craft. Having said all that, now that I realise I need to validate my product, position it, and figure out marketing, I’ve stopped writing code, put down my tools, and I’m going to learn marketing. I’ve bought some books. Maybe I’ll post some reviews here later. In no particular order, here is what I plan to read: The 1-Page Marketing Plan, by Allan Dib. Obviously Awesome, by April Dunford. Lean Analytics. Hooked: How to Build Habit-Forming Products, by Nir Eyal. The Mom Test, by Rob Fitzpatrick. This is Marketing, by Seth Godin. Product-Led Growth, by Wes Bush. I don’t know if I’ve covered 90% of the distance required or 50%. It’s exhausting, but I’m here for the journey, not the destination."},{"title":"Pippip.Email","category":"Non-technical/Entrepreneurship","url":"pippip.email.html","date":"28 October 2020","tags":"pippip ","body":"PipPip: 6 weeks ago I had an idea for a product whilst reading a news article. It would be great if the writing and sending of important messages could be separated, so that I could write something long before I needed to send it, and know that the sending would happen at the right time without having to think about it. This could be useful for sending my daughter a message on her 15th birthday, or to my wife on our 10th anniversary. I also designed a check-in mechanism, so that messages could be sent if I disappeared or passed away.
PipPip is the result - event-driven and scheduled email delivery, from days to decades. We’re working on validating the idea and finding the right market."},{"title":"Between Clients","category":"Non-technical/Social","url":"between-clients.html","date":"8 October 2019","tags":"freelance ","body":"At the end of the summer I had time between engagements to work on some side projects and grapple with some new (to me) libraries. During August and September 2019, I: Practiced creating websites. Investigated and demo’ed a library for exposing Plotly Dash apps. Built a personal finance dashboard using Plotly Dash and began turning it into a web app using Django. Interviewed for a role at CoinMetrics and created this investigation as a case study. Created a company, “Atlas Consulting International”, to facilitate life as a freelance data scientist. Spent a lot of time working at the cafe in IKEA because my co-working space wouldn’t let me stay late or work on weekends. Introduced myself to this new coworking space and suggested we could work together to create the best tech-focused startup hub and coworking space in town. Bought our first car. It required a lot more time and research than I expected. Created Texni Data Consultancy with Dan Caputo to provide data strategy services. Built the website for Texni Data using Django and deployed it to Heroku with a custom domain name. Deployed a Django app to Google Cloud Platform using App Engine. Created a business website to represent my work as a freelance data scientist and made my blog a subdomain of this site. Moved my blog off of GitHub Pages and onto Firebase. Thanks GitHub for several years of simple, trouble free hosting. Experimented with using storage buckets on Google Cloud Platform to host static sites and serve them over SSL. My conclusion is that serving static sites using storage buckets is great - it’s simple, quick and cheap (free). But adding SSL proved too difficult.
I spent too many hours trying to create a load balancer that would work for both the root domain johnmathew and also a subdomain. In the end I found out about Firebase. Firebase is also quick and cheap (free) and simple enough to set up. Created a photo book using PhotoBox that turned out great. It covers the last 3.5 years and is mostly full of snapshots of our kids and selfies with Ritsy. Chose a primary school for my daughter to go to next year."},{"title":"Analysis of the mean and median value of transactions on 5 Blockchains","category":"Technical/Data","url":"btc-fork-analysis.html","date":"2 September 2019","tags":"bitcoin, blockchain, litecoin, dogecoin, bitcoin cash, bitcoin sv, finance ","body":"This analysis was prepared for Coin Metrics as part of their recruitment process. It is a short demonstration of my thought process. The additional steps required to develop this into a useful analysis are noted. CoinMetrics Case - to evaluate skills and abilities in multiple ways: Importing, Wrangling, Exploring, Analysis, Modeling. Provide: A written explanation of how to approach the problem. Present the beginning phases of implementation using Coin Metrics data. Of the four options made available in the case study, option 3 was chosen: Advocating for CoinMetrics data - produce quality research that is of value to potential clients (doesn’t have to be complete) with a particular focus on network data. Initial ideas: My first rough ideas were: Compare different Bitcoin based chains (BTC, BCH, LTC, BSV) to test the influence of whales and compare this to their respective (evolving) claims to be a store of value (SoV) and/or alternative to cash. Develop and expand some of the research by Willy Woo.
I find his research to be outstanding. In particular I think the following metrics merit investigation: days destroyed, hodl waves, thermo cap, average cap. Tracking the number of Twitter followers of various crypto-twitter thought leaders and celebrities, to test the hypothesis that “an increase in follower numbers shows that new retail investors are entering and an increase in price is expected soon”. Thought leaders / crypto celebrities could be further grouped by what types of coins they speak about most - smart contracts, DeFi, privacy coins, etc. Weibo could be analysed as well as Twitter to understand Chinese markets, and Korean twitter could be analysed for the Korean market. I have an existing side project which has the goal of using a recurrent neural net with an LSTM architecture to predict BTC price movements. The app (model, stored data, data pipeline, visualization of results) will run autonomously on Google Cloud Platform. Candle data is consumed from CoinAPI.io and stored in BigQuery. Technical indicators will be calculated and used as additional factors to the model. Sentiment analysis from news outlets (Bloomberg, FT) would be added later. The model would be written using TensorFlow and the BigQuery table names would use BQ’s date format capabilities. This would make the project faster and cheaper. Idea 1 seemed like a sensible option. Ideas 3 and 4 are interesting and worth investigating, but not possible within the scope of this exercise. Testing the influence of whales and “normal users” on BTC and 4 BTC forks, and discussing results in the context of each chain’s claimed technical advantages and use cases as e.g. a store of value or alternative to cash. This will be achieved by comparing daily mean USD transaction value to daily median USD transaction value. This is done by calculating the mean-median ratio of transaction value (MMR).
Hypothesis: If a chain has a much smaller median transaction size than mean transaction size, then on-chain activity is dominated not by regular users making normal daily transactions but by whales moving large amounts of currency, artificially inflating usage metrics. This could contradict claims that a blockchain has an active user base or that the blockchain is meeting user needs. We assume that: If a blockchain is functioning as digital cash, then most of its transactions would be small, e.g. less than 100 USD. It should be noted that 100 USD is not a particularly small amount even in western countries, and due to a blockchain’s borderless nature, it is even further above a normal day-to-day transaction amount in large parts of the world. Conversely, if a blockchain has relatively little organic use by normal users, then whales (users with large holdings) will make up a large proportion of on-chain activity and would have average transaction sizes much larger than a day-to-day transaction. An untested guess at a “whale threshold” could be 100,000 USD. Where the ratio of mean to median transaction value is relatively high, we have an environment where the mean value is much higher than the median value, which shows that daily total value transacted is dominated by a few relatively large transactions rather than many small value transactions. This would imply that whales dominate the blockchain (and likely market behavior) rather than members of the general public or ordinary users. Chains: The chains that will be analysed here are all forks of BTC.
They\u00a0are: BTC BCH BSV LTC DOGE Fields\u00b6usi the coinmetric api, the following metrics will be\u00a0used: The sum USD value of native units transferre divided by the count of transfers (i.e., the mean \u201csize\u201d in USD of a transfer) that\u00a0inter TxTfrValMe The median USD value transferre per transfer (i.e., the median \u201csize\u201d in USD of a transfer) that\u00a0inter In\u00a0[1]: from import HTML function code_toggl { if (code_show } else { } code_show =! code_show } $( document This analysis was made using Python. You can toggle the code visibility by clicking =0);v t&&t>=0);v e=new e=new new a(1);for(v t&&t>=0);v works only with positive a(0),mod:n i=new a(1),o=new a(0),s=new a(0),l=new i,o=new a(1),s=new v[t];var y;else x;else new Error(\"Unk prime \"+t);e=new _}return works only with works only with red works only with works only with red f}}}functi strict\";va _(t){var w=[\"functi k(e,o){var \"+e);var new 0,r)};var n=\"for(var M=f[2*x];v S=f[2*x+1] E=f[2*_];v C=f[2*_+1] L=2*y;var z=2*b;var O=2*w;var I=2*p;var D=2*g;var P=2*d;for( R=0;Rt;){v u(t,e,r,n) l=new EventEmitt memory leak detected. \"+s.length listeners added. Use to increase t}function 0:return 1:return 2:return 3:return t=new instanceof Error)thro e;var l=new \"error\" event. for(var strict\";va new value \"'+t+'\" is invalid for option \"size\"');v e=new e)throw new TypeError( \"string\" argument must be of type string. Received type u(t)}retur t)return new encoding: \"+e);var TypeError( first argument must be one of type string, Buffer, ArrayBuffe Array, or Array-like Object. Received type \"+typeof new to allocate Buffer larger than maximum size: bytes\");re 0|t}functi t)throw new TypeError( \"string\" argument must be one of type string, Buffer, or ArrayBuffe Received type '+typeof t);var 0;for(var d(t,e,r){v new encoding: new must be a ... 
a=o;functi new l=c;functi u(t,e,r,n) i=new new t instanceof p(t){for(v k(t){for(v 0,o=void of the original icon Sans Unicode MS t.kind}var i(t){retur null;var 1, 2, or 3 arguments, but found instead.\") i||!(i in ct))return e.error('T item type argument of \"array\" must be one of string, number, rbga value expected an array containing either three or four numeric new new ot(r||\"Cou not parse color from value r=!0;retur expression \"'+r+'\". If you wanted a literal array, use [\"literal\" new ot(\"Input is not a void an array with at least one element. If you wanted a literal array, use [\"literal\" []].');var r)return name must be a string, but found \"+typeof r+' instead. If you wanted a literal array, use [\"literal\" ut(a,i));e null}else instanceof at)&&funct t(e){if(e instanceof yt)return instanceof instanceof r=e instanceof ht||e instanceof lt||e instanceof ut,n=!0;re instanceof s=new dt;try{i=n i}return expression \"'+r+'\". If you wanted a literal array, use [\"literal\" void value invalid. Use null objects invalid. 
Use [\"literal\" {...}] an array, but found \"+typeof t+\" new pairs for \"step\" expression must be arranged with input values in strictly ascending order.',c) new t};var new e.error(\"C bezier interpolat requires four numeric arguments with values between 0 and pairs for \"interpola expression must be arranged with input values in strictly ascending order.',h) l.N?new \"+$(l)+\" is not new ot(\"Array index out of bounds: \"+e+\" > new ot(\"Array index must be an integer, but found \"+e+\" labels must be integers no larger than branch labels must be integer null}else labels must be null;var g?new Ut(t,e){va r=e[0];thr new instanceof r=e[0];ret typeof i==typeof typeof n==typeof typeof i==typeof typeof n==typeof e[0].value in r=e[0];ret ne(t){retu ie(t){retu me(t,e,r){ n=void 0,t);if(vo 0!==r&&voi 0!==n)retu _e(t){retu t[0]&&t[0] Rt}functio we(t,e){va r=new n?Gt(new in new ot(\"Expect value to be one of \")+\", but found instanceof t;var Yt([new N(\"\",\"prop expression not Yt([new N(\"\",\"zoom expression not a=function t(e){var r=null;if( instanceof if(e instanceof Mt)for(var D(e,r,r+\" is greater than the maximum value ze(t){var function may not have a \"stops\" must have at least one required property required property functions not functions not functions not property is f(t){var D(s,a,\"arr expected, \"+fe(a)+\" D(s,a,\"arr length 2 expected, length \"+a.length D(s,a,\"obj expected, \"+fe(a[0]) D(s,a,\"obj stop key must have D(s,a,\"obj stop key must have zoom values must appear in ascending h(t,n){var D(t.key,c, stop domain type must match previous stop domain type \"+e)]}else domain value must be a number, string, or u=\"number expected, \"+s+\" found\";ret you intended to use a categorica function, specify `\"type\": typeof t!=typeof He(t){retu t(e){var D(n,r,\"arr expected, \"+fe(r)+\" found\")];v D(n,r,'\"$t cannot be use with operator D(n,r,'fil array for operator \"'+r[0]+'\" must have 3 expected, \"+i+\" instanceof new Error(\"can serialize object of 
type \"+typeof t)}functio t||t instanceof Boolean||t instanceof Number||t instanceof String||t instanceof Date||t instanceof RegExp||t instanceof instanceof fr)return t){var new Error(\"can deserializ object of anonymous class\");va new Error(\"can deserializ unregister class a}throw new Error(\"can deserializ object of type \"+typeof t)}var 1;var t};var e in zr(r,void e(e,r){for n in i in Xr(t,e){vo Zr(t,e){re new must be implemente by each concrete StructArra layout\")}; n=2*r;retu a=4*i;retu s=6*o;retu c=8*l;retu i=3*n;retu r=1*e;retu s=6*o;retu n=4*r;retu r=1*e;retu i=3*n;retu i=3*n;retu n=2*r;retu n=2*r;retu a=4*i;retu new new new new new new vertices per segment is bucket requested exceeds allowed extent, reduce your vector tile buffer size\")}ret r}function Qn={paint: t=new t=new t=new t=new new e=t;return new of range source coordinate for image new of range destinatio coordinate for image copy\");for t;e||(e=t) e}function ki(t){var Ai(t,e,r){ n=t;do{var n}function o=t;do{for Si(t,e){re r}(t,e)){v Oi(t,e){re Ri(t,e){re Wi(t,e){va na(t){retu new Error(\"unk command if(7!==r)t new Error(\"unk command u(t){for(v n=new new Error(\"fea index out of new Mn};functi c=0;cc){va Ta=new r=new new r=[],n=new Da(t,e,r){ new Fa(t,e){va null;var e=new i in t){var oo(t){retu lo(t,e,r){ new type: wo=3;funct if(void if(void e(t,e,n){v r(t,e,r){v new of range source coordinate for DEM new tiles must be _('\"'+e+'\" is not a valid encoding type. Valid types include \"mapbox\" and a=n||i;ret r=[];for(v n in t)n in f(t,e){ret h(e,r,n){v new best %d after %d s=new many glyphs being rendered in a tile. 
See r=new a in e){var l in o){var _(e,r){for n=new t.id})))}} e=new x(c),r=new n in f){var a=f[n];a instanceof k(e,r){var r(o);var r[n],e()}; e[r]};var S(t){var $(t,e){for lt(t,e){va e,r,n}func ut(t){var ft(t){retu ht(t){var r in t}function dt(t){retu t.x}functi gt(t){retu t.y}functi Et(t){var e=[];retur o}function new Error(\"max should be in the 0-24 range\");va %d clusters in z%d-%d-%d (features: %d, points: %d, simplified null;var down to parent tile i)return e(new Error(\"Inp data is not a valid GeoJSON new e.data)ret r(new Error(\"Inp data is not a valid GeoJSON r(new Error(\"Inp data is not a valid GeoJSON new Error('Wor source with name \"'+t+'\" already new Error(\"RTL text plugin already Error(\"RTL Text Plugin failed to import scripts from self&&self instanceof t,e,r=new void new Error(\"fai to create canvas 2d null;for(v y(t,e){var new Error(\"An API access token is required to use Mapbox GL. new Error(\"Use a public access token (pk.*) with Mapbox GL, not a secret access token (sk.*). 
\"+m);retur x(t){retur t;var r=A(t);ret i=A(t);ret t;var Error(\"gly > 65535 not r in out of lat: }, or an array of [, ]\")};var this._ne=t instanceof G?new this._sw=t instanceof G?new instanceof instanceof Y))return this}retur new new new instanceof Y?t:new Y(t)};var r=this;ret r.fire(new r=this,n=v in this;var this};var n=!1;for(v i in e=(t-(void n={};for(v i in o in s in new in in n||(r=new r=this;t in t in Xt(e,r){va new $t(){retur new Qt(e,r){va n={};for(v i in in O}var T=new _e=new a in l in i){var u=new 4294967295 l in s){var if(i&&o){v l in i){var if(r)for(v i in new \")+\".\");re this.fire( Error(\"An image with this name already this.fire( Error(\"No image with this name new Error(\"The is already a source with this new Error(\"The type property must be defined, but the only the following properties were given: new Error(\"The is no source with this ID\");for(v r in this.fire( Error('Sou \"'+e+'\" cannot be removed while layer \"'+r+'\" is using it.')));va Error('Lay with id \"'+i+'\" already exists on this map')));el Error('Lay with id \"'+r+'\" does not exist on this Error('Lay with id \"'+r+'\" does not exist on this this.fire( Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot be this.fire( Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot be Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot have zoom 0,void this.fire( Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot be Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot be this.fire( Error(\"The layer '\"+e+\"' does not exist in the map's style and cannot be e=this;ret void 0.5) {\\n gl_FragCol = vec4(0.0, 0.0, 1.0, 0.5) * alpha;\\n }\\n\\n if (v_notUsed > 0.5) {\\n // This box not used, fade it out\\n gl_FragCol *= .1;\\n vec2 vec2 vec2 vec2 mat4 vec2 float float float main() {\\n vec4 projectedP = u_matrix * 0, 1);\\n highp float = highp float = clamp(\\n 0.5 + 0.5 * / 0.0, // Prevents 
oversized near-field boxes in tiles\\n 4.0);\\n\\n gl_Positio = u_matrix * vec4(a_pos 0.0, 1.0);\\n gl_Positio += a_extrude * * gl_Positio * v_placed = a_placed.x v_notUsed = float float float float vec2 vec2 main() {\\n float alpha = 0.5;\\n\\n // Red = collision, hide label\\n vec4 color = vec4(1.0, 0.0, 0.0, 1.0) * alpha;\\n\\n // Blue = no collision, label is showing\\n if (v_placed > 0.5) {\\n color = vec4(0.0, 0.0, 1.0, 0.5) * alpha;\\n }\\n\\n if (v_notUsed > 0.5) {\\n // This box not used, fade it out\\n color *= .2;\\n }\\n\\n float = float extrude_le = * float stroke_wid = 15.0 * / float radius = v_radius * float = - radius);\\n float opacity_t = 0.0, gl_FragCol = opacity_t * vec2 vec2 vec2 vec2 mat4 vec2 float float float float vec2 vec2 main() {\\n vec4 projectedP = u_matrix * 0, 1);\\n highp float = highp float = clamp(\\n 0.5 + 0.5 * / 0.0, // Prevents oversized near-field circles in tiles\\n 4.0);\\n\\n gl_Positio = u_matrix * vec4(a_pos 0.0, 1.0);\\n\\n highp float padding_fa = 1.2; // Pad the vertices slightly to make room for anti-alias blur\\n gl_Positio += a_extrude * * padding_fa * gl_Positio * v_placed = a_placed.x v_notUsed = a_placed.y v_radius = // We don't pitch the circles, so both units of the extrusion vector are equal in magnitude to the radius\\n\\n v_extrude = a_extrude * = * * highp vec4 main() {\\n gl_FragCol = vec2 mat4 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, mapbox: define highp vec4 color\\n#pr mapbox: define lowp float main() {\\n #pragma mapbox: initialize highp vec4 color\\n #pragma mapbox: initialize lowp float opacity\\n\\ gl_FragCol = color * gl_FragCol = vec2 mat4 mapbox: define highp vec4 color\\n#pr mapbox: define lowp float main() {\\n #pragma mapbox: initialize highp vec4 color\\n #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, mapbox: define highp vec4 mapbox: define lowp float vec2 v_pos;\\n\\n main() {\\n #pragma mapbox: initialize highp vec4 #pragma mapbox: 
initialize lowp float opacity\\n\\ float dist = length(v_p - float alpha = 1.0 - 1.0, dist);\\n gl_FragCol = outline_co * (alpha * gl_FragCol = vec2 mat4 vec2 vec2 mapbox: define highp vec4 mapbox: define lowp float main() {\\n #pragma mapbox: initialize highp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos = / gl_Positio + 1.0) / 2.0 * vec2 vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 vec2 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp float opacity\\n\\ vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = / u_texsize, u_pattern_ / u_texsize, imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = / u_texsize, u_pattern_ / u_texsize, vec4 color2 = pos2);\\n\\n // find distance to outline for alpha float dist = length(v_p - float alpha = 1.0 - 1.0, dist);\\n\\n gl_FragCol = mix(color1 color2, u_mix) * alpha * gl_FragCol = mat4 vec2 vec2 vec2 vec2 vec2 float float float vec2 vec2 vec2 vec2 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n\\n v_pos_a = u_scale_a * a_pos);\\n v_pos_b = u_scale_b * a_pos);\\n\\ v_pos = / gl_Positio + 1.0) / 2.0 * vec2 vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp float opacity\\n\\ vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = / u_texsize, u_pattern_ / u_texsize, imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = / u_texsize, u_pattern_ / u_texsize, vec4 color2 = pos2);\\n\\n gl_FragCol = mix(color1 color2, u_mix) * gl_FragCol = mat4 vec2 vec2 vec2 vec2 float float float vec2 vec2 vec2 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n\\n v_pos_a = u_scale_a * a_pos);\\n v_pos_b = u_scale_b * vec4 mapbox: define lowp float base\\n#pra mapbox: define lowp 
float mapbox: define highp vec4 color\\n\\nv main() {\\n #pragma mapbox: initialize lowp float base\\n #pragma mapbox: initialize lowp float height\\n #pragma mapbox: initialize highp vec4 color\\n\\n gl_FragCol = gl_FragCol = mat4 vec3 lowp vec3 lowp float vec2 vec4 vec4 mapbox: define lowp float base\\n#pra mapbox: define lowp float mapbox: define highp vec4 color\\n\\nv main() {\\n #pragma mapbox: initialize lowp float base\\n #pragma mapbox: initialize lowp float height\\n #pragma mapbox: initialize highp vec4 color\\n\\n vec3 normal = base = max(0.0, base);\\n height = max(0.0, height);\\n float t = mod(normal 2.0);\\n\\n gl_Positio = u_matrix * vec4(a_pos t > 0.0 ? height : base, 1);\\n\\n // Relative luminance (how dark/brigh is the surface color?)\\n float colorvalue = color.r * 0.2126 + color.g * 0.7152 + color.b * 0.0722;\\n\\ v_color = vec4(0.0, 0.0, 0.0, 1.0);\\n\\n // Add slight ambient lighting so no extrusions are totally black\\n vec4 ambientlig = vec4(0.03, 0.03, 0.03, 1.0);\\n color += // Calculate cos(theta) where theta is the angle between surface normal and diffuse light ray\\n float directiona = / 16384.0, u_lightpos 0.0, 1.0);\\n\\n // Adjust directiona so that\\n // the range of values for is narrower\\n // with lower light intensity\\ // and with surface colors\\n directiona = mix((1.0 - max((1.0 - colorvalue + 1.0), // Add gradient along z axis of side surfaces\\n if (normal.y != 0.0) {\\n directiona *= clamp((t + base) * pow(height / 150.0, 0.5), mix(0.7, 0.98, 1.0 - 1.0);\\n }\\n\\n // Assign final color based on surface + ambient light color, diffuse light directiona and light color\\n // with lower bounds adjusted to hue of light\\n // so that shading is tinted with the complement (opposite) color to the light color\\n v_color.r += clamp(colo * directiona * mix(0.0, 0.3, 1.0 - 1.0);\\n v_color.g += clamp(colo * directiona * mix(0.0, 0.3, 1.0 - 1.0);\\n v_color.b += clamp(colo * directiona * mix(0.0, 0.3, 1.0 - vec2 vec2 vec2 vec2 vec2 
float sampler2D vec2 vec2 vec4 mapbox: define lowp float base\\n#pra mapbox: define lowp float height\\n\\n main() {\\n #pragma mapbox: initialize lowp float base\\n #pragma mapbox: initialize lowp float height\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = / u_texsize, u_pattern_ / u_texsize, imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = / u_texsize, u_pattern_ / u_texsize, vec4 color2 = pos2);\\n\\n vec4 mixedColor = mix(color1 color2, u_mix);\\n\\ gl_FragCol = mixedColor * gl_FragCol = mat4 vec2 vec2 vec2 vec2 float float float float vec3 lowp vec3 lowp float vec2 vec4 vec2 vec2 vec4 float mapbox: define lowp float base\\n#pra mapbox: define lowp float height\\n\\n main() {\\n #pragma mapbox: initialize lowp float base\\n #pragma mapbox: initialize lowp float height\\n\\n vec3 normal = float edgedistan = base = max(0.0, base);\\n height = max(0.0, height);\\n float t = mod(normal 2.0);\\n float z = t > 0.0 ? height : base;\\n\\n gl_Positio = u_matrix * vec4(a_pos z, 1);\\n\\n vec2 pos = normal.x == 1.0 && normal.y == 0.0 && normal.z == 16384.0\\n ? 
a_pos // extrusion top\\n : z * // extrusion side\\n\\n v_pos_a = u_scale_a * pos);\\n v_pos_b = u_scale_b * pos);\\n\\n v_lighting = vec4(0.0, 0.0, 0.0, 1.0);\\n float directiona = / 16383.0, u_lightpos 0.0, 1.0);\\n directiona = mix((1.0 - max((0.5 + 1.0), if (normal.y != 0.0) {\\n directiona *= clamp((t + base) * pow(height / 150.0, 0.5), mix(0.7, 0.98, 1.0 - 1.0);\\n }\\n\\n v_lighting += * u_lightcol mix(vec3(0 vec3(0.3), 1.0 - u_lightcol sampler2D float vec2 v_pos;\\n\\n main() {\\n gl_FragCol = v_pos) * gl_FragCol = mat4 vec2 vec2 vec2 v_pos;\\n\\n main() {\\n gl_Positio = u_matrix * vec4(a_pos * u_world, 0, 1);\\n\\n v_pos.x = a_pos.x;\\n v_pos.y = 1.0 - highp sampler2D vec2 vec2 float float coord, float bias) {\\n // Convert encoded elevation value to meters\\n vec4 data = coord) * 255.0;\\n return (data.r + data.g * 256.0 + data.b * 256.0 * 256.0) / main() {\\n vec2 epsilon = 1.0 / // queried pixels:\\n // // | | | |\\n // | a | b | c |\\n // | | | |\\n // // | | | |\\n // | d | e | f |\\n // | | | |\\n // // | | | |\\n // | g | h | i |\\n // | | | |\\n // float a = + -epsilon.y 0.0);\\n float b = + vec2(0, -epsilon.y 0.0);\\n float c = + -epsilon.y 0.0);\\n float d = + 0), 0.0);\\n float e = 0.0);\\n float f = + 0), 0.0);\\n float g = + epsilon.y) 0.0);\\n float h = + vec2(0, epsilon.y) 0.0);\\n float i = + epsilon.y) 0.0);\\n\\n // here we divide the x and y slopes by 8 * pixel size\\n // where pixel size (aka meters/pix is:\\n // circumfere of the world / (pixels per tile * number of tiles)\\n // which is equivalent to: 8 * / (512 * pow(2, u_zoom))\\n // which can be reduced to: pow(2, 19.2561997 - u_zoom)\\n // we want to vertically exaggerate the hillshadin though, because otherwise\\ // it is barely noticeable at low zooms. to do this, we multiply this by some\\n // scale factor pow(2, (u_zoom - u_maxzoom) * a) where a is an arbitrary value\\n // Here we use a=0.3 which works out to the expression below. 
see \\n // nickidluga awesome breakdown for more info\\n // float exaggerati = u_zoom 0.0 ? 1.0 : -1.0);\\n\\n float intensity = u_light.x; // We add PI to make this property match the global light object, which adds PI/2 to the light's azimuthal\\ // position property to account for 0deg correspond to north/the top of the viewport in the style spec\\n // and the original shader was written to accept - 90) as the azimuthal. float azimuth = u_light.y + PI;\\n\\n // We scale the slope exponentia based on intensity, using a calculatio similar to\\n // the exponentia interpolat function in the style spec:\\n // // so that higher intensity values create more opaque hillshadin float base = 1.875 - intensity * 1.75;\\n float maxValue = 0.5 * PI;\\n float scaledSlop = intensity != 0.5 ? ((pow(base slope) - 1.0) / (pow(base, maxValue) - 1.0)) * maxValue : slope;\\n\\n // The accent color is calculated with the cosine of the slope while the shade color is calculated with the sine\\n // so that the accent color's rate of change eases in while the shade color's eases out.\\n float accent = // We multiply both the accent and shade color by a clamped intensity value\\n // so that intensitie >= 0.5 do not additional affect the color values\\n // while intensity values 0.0 ? ANTIALIASI : 0.0);\\n float outset = gapwidth + halfwidth * (gapwidth > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset2 = offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n vec4 = u_matrix * vec4(dist / u_ratio, 0.0, 0.0);\\n gl_Positio = u_matrix * vec4(pos + offset2 / u_ratio, 0.0, 1.0) + // calculate how much the perspectiv view squishes or stretches the extrude\\n float = float = / gl_Positio * v_gamma_sc = / v_width2 = vec2(outse mapbox: define lowp float blur\\n#pra mapbox: define lowp float sampler2D vec2 vec2 float highp float main() {\\n #pragma mapbox: initialize lowp float blur\\n #pragma mapbox: initialize lowp float opacity\\n\\ // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line (v_width2. or when fading out\\n // (v_width2. float blur2 = (blur + 1.0 / * float alpha = clamp(min( - (v_width2. - blur2), v_width2.s - dist) / blur2, 0.0, 1.0);\\n\\n // For gradient lines, v_lineprog is the ratio along the entire line,\\n // scaled to [0, 2^15), and the gradient ramp is stored in a texture.\\n vec4 color = 0.5));\\n\\n gl_FragCol = color * (alpha * gl_FragCol = the attribute conveying progress along a line is scaled to [0, 2^15)\\n#de 32767.0\\n\\ the distance over which the line edge fades out.\\n// Retina devices need a smaller distance to avoid ANTIALIASI 1.0 / / 2.0\\n\\n// floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. 
the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale vec4 vec4 mat4 mediump float vec2 vec2 vec2 float highp float mapbox: define lowp float blur\\n#pra mapbox: define lowp float mapbox: define mediump float mapbox: define lowp float mapbox: define mediump float width\\n\\nv main() {\\n #pragma mapbox: initialize lowp float blur\\n #pragma mapbox: initialize lowp float opacity\\n #pragma mapbox: initialize mediump float gapwidth\\n #pragma mapbox: initialize lowp float offset\\n #pragma mapbox: initialize mediump float width\\n\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n\\n v_lineprog = / 4.0) + a_data.w * 64.0) * 2.0 / vec2 pos = // x is 1 if it's a round cap, 0 otherwise\\ // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = v_normal = normal;\\n\\ // these used to be applied in the JS and native code bases.\\n // moved them into the shader for clarity and simplicity gapwidth = gapwidth / 2.0;\\n float halfwidth = width / 2.0;\\n offset = -1.0 * offset;\\n\\ float inset = gapwidth + (gapwidth > 0.0 ? ANTIALIASI : 0.0);\\n float outset = gapwidth + halfwidth * (gapwidth > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset2 = offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n vec4 = u_matrix * vec4(dist / u_ratio, 0.0, 0.0);\\n gl_Positio = u_matrix * vec4(pos + offset2 / u_ratio, 0.0, 1.0) + // calculate how much the perspectiv view squishes or stretches the extrude\\n float = float = / gl_Positio * v_gamma_sc = / v_width2 = vec2(outse vec2 vec2 vec2 vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 float float mapbox: define lowp float blur\\n#pra mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp float blur\\n #pragma mapbox: initialize lowp float opacity\\n\\ // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line (v_width2. or when fading out\\n // (v_width2. float blur2 = (blur + 1.0 / * float alpha = clamp(min( - (v_width2. - blur2), v_width2.s - dist) / blur2, 0.0, 1.0);\\n\\n float x_a = / 1.0);\\n float x_b = / 1.0);\\n\\n // v_normal.y is 0 at the midpoint of the line, -1 at the lower edge, 1 at the upper edge\\n // we clamp the line width outset to be between 0 and half the pattern height plus padding (2.0)\\n // to ensure we don't sample outside the designated symbol on the sprite sheet.\\n // 0.5 is added to shift the component to be bounded between 0 and 1 for interpolat of\\n // the texture coordinate float y_a = 0.5 + (v_normal. * 0.0, + 2.0) / 2.0) / float y_b = 0.5 + (v_normal. * 0.0, + 2.0) / 2.0) / vec2 pos_a = / u_texsize, u_pattern_ / u_texsize, vec2(x_a, y_a));\\n vec2 pos_b = / u_texsize, u_pattern_ / u_texsize, vec2(x_b, y_b));\\n\\n vec4 color = pos_a), pos_b), u_fade);\\n gl_FragCol = color * alpha * gl_FragCol = floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. 
the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. Use this value to unscale the 2.0\\n\\n// the distance over which the line edge fades out.\\n// Retina devices need a smaller distance to avoid ANTIALIASI 1.0 / / vec4 vec4 mat4 mediump float vec2 vec2 vec2 float float mapbox: define lowp float blur\\n#pra mapbox: define lowp float mapbox: define lowp float mapbox: define mediump float mapbox: define mediump float width\\n\\nv main() {\\n #pragma mapbox: initialize lowp float blur\\n #pragma mapbox: initialize lowp float opacity\\n #pragma mapbox: initialize lowp float offset\\n #pragma mapbox: initialize mediump float gapwidth\\n #pragma mapbox: initialize mediump float width\\n\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * vec2 pos = // x is 1 if it's a round cap, 0 otherwise\\ // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = v_normal = normal;\\n\\ // these used to be applied in the JS and native code bases.\\n // moved them into the shader for clarity and simplicity gapwidth = gapwidth / 2.0;\\n float halfwidth = width / 2.0;\\n offset = -1.0 * offset;\\n\\ float inset = gapwidth + (gapwidth > 0.0 ? ANTIALIASI : 0.0);\\n float outset = gapwidth + halfwidth * (gapwidth > 0.0 ? 
2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset2 = offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n vec4 = u_matrix * vec4(dist / u_ratio, 0.0, 0.0);\\n gl_Positio = u_matrix * vec4(pos + offset2 / u_ratio, 0.0, 1.0) + // calculate how much the perspectiv view squishes or stretches the extrude\\n float = float = / gl_Positio * v_gamma_sc = / v_linesofa = a_linesofa v_width2 = vec2(outse sampler2D float float vec2 vec2 vec2 vec2 float mapbox: define highp vec4 color\\n#pr mapbox: define lowp float blur\\n#pra mapbox: define lowp float mapbox: define mediump float width\\n#pr mapbox: define lowp float main() {\\n #pragma mapbox: initialize highp vec4 color\\n #pragma mapbox: initialize lowp float blur\\n #pragma mapbox: initialize lowp float opacity\\n #pragma mapbox: initialize mediump float width\\n #pragma mapbox: initialize lowp float floorwidth // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line (v_width2. or when fading out\\n // (v_width2. float blur2 = (blur + 1.0 / * float alpha = clamp(min( - (v_width2. 
In [3]:

    def get_data(asset, payload):
        # NB: the function name, parameters and API endpoint are truncated in the
        # original notebook; this is a reconstruction of the surviving structure.
        url = …  # Coin Metrics community API endpoint (elided in the original)
        response = requests.get(url=url, params=payload)
        if response.status_code == 200:
            print(asset + ' - success!')
            return response.json()
        else:
            return None

In [4]:

    payload = {
        'metrics': 'PriceUSD,TxTfrValU…',  # metric list truncated in the original
        'start': '2016-01-01',
    }
    # PriceUSD and TxTfrValUSD are not utilised yet. Because the work needs to be
    # expanded in order to be complete, I will keep them here for now.
    asset_list = ['btc', 'ltc', 'bch', 'bsv', 'doge']
    data = {}
    for asset in asset_list:
        data[asset] = get_data(asset, payload)  # fetch helper from In [3]; its name is elided in the original

In [5]:

    dataframes = {}
    cols = ['PriceUSD', 'TxTfrValU…']  # remaining column names truncated in the original
    for asset in data.keys():
        # list-comprehension sources are elided in the original
        values = [each['values'] for each in …]
        index = [each['time'] for each in …]
        df = pd.DataFrame(values, columns=cols)
        df.index = pd.to_datetime(index)      # reconstructed; the original call is truncated
        for col in df.columns:
            df[col] = pd.to_numeric(df[col])  # reconstructed; the original call is truncated
        # create new fields (derived-column arithmetic largely elided in the original)
        df['TxCount'] = df.TxTfrVa… / …
        dataframes[asset] = df

In [6]:

    # take a look at the wrangled data:

Out[6]: (sample rows of one wrangled dataframe: PriceUSD, TxTfrValMe…, TxTfrValUS…, TxCount and derived columns, with 2017-2019 dates)

Out[6]: (the same sample view for a second asset)

Out[6]: (the same sample view for a third asset)

Compare daily mean
and median USD transaction value for BTC since January 2016¶

In [7]:

    btc_mean = go.Scatter(
        # x/y data arguments elided in the original
        name='BTC mean',
    )
    btc_median = go.Scatter(
        name='BTC median',
    )
    data = [btc_mean, btc_median]
    layout = go.Layout(
        title="BTC median and mean transaction values by day",
        # an axis titled 'value' survives; its surrounding arguments are elided
    )
    fig = go.Figure(data=data, layout=layout)  # constructor reconstructed from the truncated call
    py.iplot(fig)

Out[7]:

The chart above shows that the daily mean transaction value is higher than the daily median, and that the two averages are correlated. From 2016 to present, the mean is approximately 2 orders of magnitude higher than the median. This relationship appears to be consistent across the previous 4 years. Note: During the last 4 years, the USD value of 1 BTC has increased from ~400 USD to currently ~10,000 USD. The impact of the changing USD price of the coins on mean and median should be investigated. This could be easily achieved using the TxTfrValUS… and PriceUSD metrics. We could also then calculate the number of…

Plot the ratio of daily mean to median USD transaction values for each asset since January 2016¶

In [8]:

    def make_trace(df, name):  # helper name reconstructed; data arguments elided
        return go.Scatter(
            name=name,
        )

    data = [make_trace(dataframes[asset], asset) for asset in asset_list]
    layout = go.Layout(
        title="Ratio of daily mean to median transaction value",
    )
    fig = go.Figure(data=data, layout=layout)
    py.iplot(fig)

Out[8]:

Note: Except for BTC, the time series above are very “choppy”. If this chart were to be shown to clients I would consider smoothing the time series by using e.g. a 7 day moving average. However, this might obscure some features of the data, so for the initial data exploration I will not smooth it. The chart above shows that BTC has the lowest ratio of mean to median daily transaction value. This suggests that, compared to the other blockchains in this comparison, BTC: has relatively strong organic use; is less influenced by whales; its
MMR has lower… Point 3 suggests a wide and regular user base, and total daily transaction volumes should be analysed across the 5 chains to further strengthen or rebut this. Using this ratio as a proxy to measure organic use, the chain with the second most organic use is Litecoin. Since the start of 2019, the influence of whales on the Dogecoin network has… Of the two contentious hard forks, Bitcoin Cash shows two distinct phases with different MMR behaviour in each: From its inception in August 2017 to November 2018, the influence of whales increased at a steady rate. At the coin’s genesis, there appears to have been a large organic user base transacting daily, bringing the median transaction value to within 50 - 100x the mean daily transaction value. This was lower than Bitcoin’s, which had a much more consistent but higher MMR of 120 - 200. After November 10 2018, the ratio increases from an average of approximately 500 to approximately 10,000. This is a stark and abrupt change in the daily ratio, and suggests that either: organic use drastically decreased, or BCH very suddenly started being used to facilitate very large value transfers by relatively few users. As of January 2019, Dogecoin appears to have more widespread organic use than either BCH or BSV, despite its status as a “joke” blockchain. However, DOGE has had a higher MMR than BTC or LTC in 2019. Next Steps¶ This brief investigation was developed over the course of an afternoon, in line with the project brief recommending only 4 hours of work. In order to be applied in a commercial context, this analysis should be expanded and tested in at least the following ways: Test if the central assumption of this analysis is true. Possible approaches could include: Removing exchange outflows from the data.
Could this be done using known exchange addresses (exchanges aggregate organic retail…)? Quantifying the influence of “change” transactions - in aggregate this should be nil for day-to-day “cash” transactions, but for whales moving the entire balance of an address there would be no “change” amount. Depending on how the metric is calculated, this may or may not… For BTC and LTC, are the lightning networks distorting the results by hiding organic, low-value transactions? For BTC, is the Liquid sidechain hiding the activity of whales, to the extent that it is not the “healthiest” of the 5 chains? Can we infer where the whales and “normal” users live by analyzing the time of transactions? People are much more likely to make a transaction at midday than at midnight, and we could use this to investigate geographic clustering, e.g. is BTC a “western” chain, whilst BCH has more organic use in Asia? An analysis of daily transaction volume (in USD terms) would be essential to this analysis. It would provide a context in which to interpret the significance of differences between each chain and differences between time frames. Similarly, comparing the hash power dedicated to mining new blocks on each chain would indicate commercial interests, and abrupt changes in hash power could possibly be correlated with changes in the mean-median ratio (MMR).
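The two ideas running through the analysis above — the mean/median ratio (MMR) as a whale-influence proxy, and smoothing a choppy series with a 7 day moving average — can be sketched with synthetic numbers (illustrative values only, not real chain data):

```python
import pandas as pd

# Synthetic daily "transaction values" in USD: mostly small organic payments,
# plus one whale-sized transfer (illustrative numbers only).
organic = [25.0, 40.0, 55.0, 60.0, 80.0, 95.0, 120.0, 150.0, 200.0, 300.0]
day = pd.Series(organic + [5_000_000.0])

# A single whale drags the mean far above the median, producing a high MMR.
mmr = day.mean() / day.median()

# A 7-day moving average smooths a choppy daily ratio series
# (here a stand-in ramp of values, not real data).
dates = pd.date_range('2019-01-01', periods=14, freq='D')
ratio = pd.Series(range(1, 15), index=dates, dtype=float)
smoothed = ratio.rolling(window=7).mean()

print(round(mmr))        # in the thousands, despite only one whale transfer
print(smoothed.iloc[6])  # first full window: mean of 1..7 = 4.0
```

The smoothing trade-off mentioned in the note applies here too: `rolling(window=7)` leaves the first six points undefined and dampens short-lived spikes.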
"},{"title":"A faster\u00a0shell","category":"Technical/Developer Tools","url":"shell.html","date":"14 February 2019","tags":"shell, unix, zsh, bash, profiling ","body":"Opening up a new shell was annoyingly slow. Not terrible, but enough to notice. It's a\u00a0niggle. I wanted to find out which components were causing the most delay, so I used time to measure how long it took to launch a shell. Even though shells might appear to be part of the low level 'guts' of a computer, each shell is just an executable and can be treated as\u00a0such. To measure the startup speed of your shell,\u00a0do: for i in $(seq 1 10); do /usr/bin/time $SHELL -i -c exit; done This shows that it takes 0.84 seconds to start zsh - not terrible, but not\u00a0great: You can compare the performance of different shells by replacing $SHELL with zsh, bash, fish etc. Here are the results if I used BASH instead of zsh - 9.3x faster! (but without useful tools and plugins): Now that I can measure how long it takes to start, it would be useful to know which processes are causing the greatest delays. This could be done with something like zsh -xv, which enables verbose output and xtrace. This creates a tonne of output, but doesn't include timestamps. All I really want is a summary of how much time each subprocess required to run, i.e. an ordered\u2026 Add zmodload zsh/zprof at the start of .zshrc and zprof at the very end. Now when I start zsh I see the\u00a0following: Next steps - make \u2026 run faster, asynchronous, or not at\u00a0all\u2026 Update: \u2026 is the biggest cause of slow loading. Using the lazy loading option decreased loading time by\u00a00.3s"},{"title":"Bitcoin\u00a0Lightning","category":"Technical/Cryptocurrencies","url":"bitcoin-lightning.html","date":"24 January 2019","tags":"bitcoin, lightning, crypto ","body":"One of the largest obstacles (second only to privacy in my opinion) to widespread adoption of Bitcoin is its limited transaction volume.
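The throughput limitation can be put in rough numbers with a back-of-envelope estimate (assumed round figures: ~1 MB blocks, ~250-byte transactions, 10-minute block intervals):

```python
# Back-of-envelope Bitcoin base-layer throughput estimate.
# All three inputs are assumed round numbers, not measured values.
block_size_bytes = 1_000_000  # ~1 MB block
avg_tx_bytes = 250            # rough average transaction size
block_interval_s = 600        # ~10 minutes per block

txs_per_block = block_size_bytes // avg_tx_bytes
tx_per_second = txs_per_block / block_interval_s

print(txs_per_block, tx_per_second)  # 4000 transactions per block, ~6.7 tx/s
```

A handful of transactions per second on the base layer is orders of magnitude below card-network volumes, which is the gap Layer 2 solutions like Lightning aim to close.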
Bitcoin cannot facilitate payments fast enough to compete with Visa or Mastercard. The most interesting solution to this problem is the Lightning protocol - a separate protocol that sits on top of the Bitcoin protocol (a so-called 'Layer 2' solution). Lightning uses hashed time-locked contracts (HTLCs) to trustlessly and privately move transactions off-chain. This allows payments to be faster, cheaper and more frequent, and it also has interesting implications. There are a lot of resources about what the Lightning network is, why it's necessary and how it works. There are also several good guides available about how to set up and maintain a node. I used a Raspberry Pi with an external HDD. It took a few attempts, mostly because it's my first time working with a Unix operating system and I tried to move a swap file to a disk that wasn't formatted as ext4… Anyway, I've opened and closed some channels, connected to peers, and made some transactions. I even bought some stickers. You can find my node using these details: Alias: Public Key: IP address: 85.145.183 Port: 9735 Some Lightning resources and interesting things to do."},{"title":"Sync a BTC node,\u00a0quickly","category":"Technical/Cryptocurrencies","url":"sync-bitcoin-core-node.html","date":"13 October 2018","tags":"bitcoin, btc-core, blockchain ","body":"In order to run your own bitcoin node, or lightning node, you'll need to download the entire bitcoin blockchain and then validate it. This takes ages on a magnetic disk due to its slow random access speed. To remove this bottleneck, move the chainstate directory to an SSD (it's only a few GB) and symlink to it from the bitcoin data directory. More details are on the Bitcoin wiki.
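The move-and-symlink trick can be sketched with Python's standard library. The paths below are temporary stand-ins created purely for demonstration; substitute your real bitcoin data directory and SSD mount point, and make sure bitcoind is stopped first.

```python
# Sketch of the chainstate-to-SSD trick. The directories here are
# temporary placeholders, NOT real bitcoin paths.
import shutil
import tempfile
from pathlib import Path

datadir = Path(tempfile.mkdtemp())   # stands in for ~/.bitcoin
ssd = Path(tempfile.mkdtemp())       # stands in for the SSD mount point
(datadir / "chainstate").mkdir()     # pretend this is the existing UTXO db

# 1. Move the chainstate onto the SSD and symlink it back into the
#    data directory, so bitcoind still finds it at the usual path:
shutil.move(str(datadir / "chainstate"), str(ssd / "chainstate"))
(datadir / "chainstate").symlink_to(ssd / "chainstate", target_is_directory=True)

# 2. After the initial sync completes, remove the symlink and move the
#    real directory back:
(datadir / "chainstate").unlink()    # removes only the symlink
shutil.move(str(ssd / "chainstate"), str(datadir / "chainstate"))
```

On the command line the same two steps are just mv and ln -s, as described on the Bitcoin wiki.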
When the sync is complete, replace the symlink with the actual directory."},{"title":"Bakke-Rij","category":"Non-technical/Entrepreneurship","url":"bakkerij.html","date":"11 June 2018","tags":"coworking, netherlands, haarlem ","body":"I recently began working in a new coworking space, sharing an office unit with another entrepreneur. What makes this coworking space unique is its conversion from an industrial bakery to a coworking space. Where the machinery once stood, converted shipping containers with glass walls have become offices for startups. The skylights, glass walls and bright colours create a light and airy atmosphere, and the space seems popular with designers and founders working in creative industries. It's an energetic space with a creative feel to it."},{"title":"Prediction\u00a0Markets","category":"Technical/Cryptocurrencies","url":"prediction.html","date":"7 June 2018","tags":"gnosis, prediction, markets, brexit, betting ","body":"Predicting port traffic using a prediction market. Background A prediction market allows people to bet on an unknown future event. For example, “What will the Euro-Dollar exchange rate be on date X?”. In several common forecasting scenarios, prediction markets have been more accurate than polls, expert opinions, and statistical methods1, and therefore prediction markets are useful for observers (anybody who is interested in the outcome) and not just the market's participants. Prediction markets can be used for categorical events (a specific event that either does or doesn't happen) or scalar events (when the outcome is between a range of values). The predefined source of truth for the outcomes being predicted is called an Oracle. Prediction markets enable participants to purchase shares or tokens tied to the outcome of a specific future event. Once the event has occurred, holders of the tokens representing the actual outcome will receive a reward of predefined value.
This creates an incentive to hold tokens corresponding to the correct outcome, and the market dynamics of supply and demand allow the price to reflect the perceived probabilities of different outcomes. Helpfully, the price of each type of token corresponds to the relative probability of each outcome occurring, which allows for simple interpretation of the results. If the reward for holding a share corresponding to the correct outcome is $1, and the present price of this share is 50 cents, then the market's estimate of the likelihood of this outcome occurring is 50%. Shares can be traded continuously. As trading occurs over time, the probabilities of different outcomes will change as new information becomes known, and the changing price of the shares quantifies this. An example Let's describe how this could work with an example. There is a large degree of uncertainty around how Britain will continue to trade with the other European countries when it exits the EU on March 29, 2019. A prediction market will likely be a better predictor of the outcome than any other method. If an adequate agreement isn't achieved, Britain's main port at Dover will certainly experience long delays and large traffic jams. Therefore a prediction market asking “How many vehicles will be admitted into Britain at the port of Dover between 00:00 and 23:59 on March 29 2019?” will give a useful prediction about the outcome of Britain's trade negotiations - a key component and sticking point in the Brexit process. Each type of token in the prediction market will correspond to different quantities of vehicles entering the port3 - for example there could be four (or more) categories: fewer than 8000, between 8000 and 11000, between 11000 and 14000, and more than 14000.
The relative price of a share in each category will correspond to the relative probability of each outcome. If the number of vehicles is lower than would otherwise be expected (around 12000), this would likely be due to the impact of Brexit, and thus the market would serve as a useful proxy for predicting what kind of Brexit will occur. Stakeholder incentives are aligned One reason that prediction markets work so well is that they aggregate information from disparate sources, and the price shows not only an impartial assessment of the most likely outcome but also the aggregated level of confidence that the participants have. Since no one is obliged to participate, those that do believe they have valuable information which gives them a competitive advantage. This creates a mechanism that moves good quality information into the prediction market, with the resulting prices reflecting the probabilities of a range of outcomes. In our example, people with relevant information would include port employees, business owners in the UK and in Europe, politicians, civil servants, business analysts, bankers, etc. Whilst it is clear that any of these roles may have useful information or judgement about the outcome, it is not clear how useful each participant's role is relative to the others. By allowing a participant to bid for as many shares in as many different categories as they want, each participant's confidence in their information can be quantified. In this way, prediction markets align the incentives of market participants and observers. If the market is large enough then it becomes prohibitively expensive to distort the market and promote poor quality information, including information designed to create FUD (Fear, Uncertainty, Doubt) or fake news. Prediction markets are nothing new, with political betting being used to make predictions as early as the 1500s. However, the adoption of the internet and decentralised networks now allows prediction markets to be used more widely and cheaply than ever before.
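The price-to-probability reading described in this example can be sketched in Python. The prices below are invented for illustration: with a $1 payout per winning share, each price is roughly the market's probability for that outcome, normalised so the categories sum to one.

```python
# Invented last-traded prices (in dollars) for the four Dover
# vehicle-count categories; the payout is $1 per winning share.
prices = {
    "<8000": 0.12,
    "8000-11000": 0.18,
    "11000-14000": 0.30,
    ">14000": 0.45,
}

# Prices rarely sum to exactly $1, so normalise them to get
# implied probabilities that sum to one.
total = sum(prices.values())
implied = {outcome: price / total for outcome, price in prices.items()}

for outcome, probability in implied.items():
    print(f"{outcome}: {probability:.0%}")
```

Reading the result is then direct: a normalised price of 0.50 means the market assigns that outcome a 50% chance.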
Gnosis is building a platform on the Ethereum blockchain on which others can build new applications which harness the power of prediction markets. By lowering the cost and complexity of creating a prediction market, observers can benefit from high quality and impartial predictive information about future events, and market participants are rewarded for accurate assessments of likely outcomes. This will enable better decision making and empower observers with previously unobtainable insight. In our example, the market could be created and funded by any organisation that would benefit from knowing the results of the prediction market. This could be the port of Dover itself, news organisations, or market research firms. Any of these businesses would benefit from an impartial forecast. K. J. Arrow, R. Forsythe, M. Gorham, R. Hahn, R. Hanson, J. O. Ledyard, S. Levmore, R. Litan, P. Milgrom, F. D. Nelson, G. R. Neumann, M. Ottaviani, T. C. Schelling, R. J. Shiller, V. L. Smith, E. Snowberg, C. R. Sunstein, P. C. Tetlock, P. E. Tetlock, H. R. Varian, J. Wolfers, and E. Zitzewitz. The promise of prediction markets. Science, 320(5878), 2008. ↩ For example Data from"},{"title":"Reading: April\u00a02018","category":"Non-technical/Learning","url":"reading-april-2018.html","date":"4 June 2018","tags":"reading, learning ","body":"Articles Mental models - A comprehensive list organised around different disciplines. It's a long read but probably one of the most easy to apply and tangibly useful articles I've read recently. Life is short - Is life actually short, or would we always want more time, no matter how much we could have? Invisible asymptotes - A long, well-written look at various facets of its subject. Factors from scratch - Investing: a unified framework to explain how factors work. This is tragic - “…But I do think this highlights the potential disconnect between mental health & business, publicity & success, and success & happiness.
The internet can seem so intimate but ultimately it's a thin view of an individual.” Vim after 15 years Cryptocurrency regulation around the world On the 2008 financial crisis - “I'm not sure if it's possible for an action to be both necessary and a disaster, but that in essence is what it was.” Podcasts Jill Carlson on the “What Bitcoin Did” podcast. “For the first time there is no longer a monopoly on the creation of value or monetary systems.” What Bitcoin Did Georges St-Pierre on the Joe Rogan podcast. A long and candid conversation from one of the world's best athletes. Ricardo Spagni a.k.a Fluffypony on Monero vs Bitcoin, EOS, the current bear market, Tari, and ASICs. Resources Lots of data sets"},{"title":"Ry\u2019s Git\u00a0Tutorial","category":"Technical/Developer Tools","url":"rys-git-tutorial.html","date":"1 June 2018","tags":"git, rys, tutorial, ryan hodson ","body":"For tracking changes to a collection of files, Git is the ubiquitous solution. It's free, robust, comprehensive, and there is a plethora of resources that are easy to find. I usually find the commands difficult to remember though, and the concepts which Git is built on often seem opaque to me. This means I spend a lot of time searching for answers and trying to remember how I can use Git to experiment with a project without fear of losing any hard-won progress. Ry's Git Tutorial by Ryan Hodson is the best way to learn Git that I have come across. It's simple, practical, and clear. The reader learns how to use Git by creating and maintaining a simple website1. This gives the Git commands a meaningful context, which makes them a lot easier to remember and use in the future. The tutorial was first published in 2012 and the website which originally hosted the examples no longer exists.
Each tutorial chapter starts with a link to download the project files up to that point, so the reader doesn't need to start at the beginning but can jump into the guide at any point. Unfortunately these links no longer work, so I'm going to make the materials available here so that they can continue to be useful. If the author would like to get in touch, please do. I'd like to keep this great resource available so that others can benefit from it. .epub or .pdf versions are available to download. Download the example files for each module below: Chapter 2: Undoing Changes Chapter 3: Branches I Chapter 4: Branches II Chapter 5: Rebasing Chapter 6: Rewriting History Chapter 7: Remotes Chapter 8: Centralized Workflows Chapter 9: Distributed Workflows Chapter 10: Patch Workflows Chapter 11: Tips & Tricks Chapter 12: Plumbing This tutorial was also the first time I created a simple website, and it led to so many “ahah!” moments. It unlocked all the web development progress that followed. ↩"},{"title":"The Bitcoin Lightning\u00a0Network","category":"Technical/Cryptocurrencies","url":"lightning.html","date":"12 April 2018","tags":"bitcoin ","body":"The lightning network is a protocol that operates on top of the Bitcoin network. It allows instant transactions between participants and is the leading solution to Bitcoin's current scaling problems. This post is based on the excellent seminar by Joseph Poon and Thaddeus Dryja, “Scaling Bitcoin to Billions of Transactions Per Day”, which was given at the San Francisco bitcoin developers conference in early 2015. You can see the seminar here. The Problem Bitcoin doesn't scale well enough to facilitate the rate of transactions necessary for it to become a medium of exchange in everyday life.
This is because: Transactions aren't instant - the average block time is 10 minutes. Transaction fees are variable and too high (particularly when blocks are almost full) to enable low value payments. Currently, Bitcoin has a 1MB block size limit. This allows about 2750 transactions per block (link), or 4.6 transactions per second. This isn't fast enough for a global payment network. Solutions Bigger blocks In 2017 there was a lot of contentious debate about how to solve Bitcoin's scaling problems. One of the most frequently suggested solutions is to increase the block size, so that more transactions can fit into each block. If you had 7 billion people making 2 transactions per day, you would need 24GB blocks, generating data at a rate of 3.5TB/day. This would make running a full bitcoin node impractical for many people, which would result in fewer miners and more centralisation. (Note: you would expect corporations and large miners to support efforts to increase block sizes, because the associated infrastructure cost increases create a higher barrier to entry for newcomers. This would decrease the competition for new blocks and protect their revenue from miners' fees.) More SQL A SQL database is a much more efficient model than a blockchain. It's scalable and fast, and is what is used today to power Visa, Mastercard and central banks. But it isn't trustless. With a SQL database model you have a trusted 3rd party maintaining the database, which everyone else needs to query to discover or verify a balance. This is equivalent to giving the 3rd party your money and trusting they do the right thing. Side chains A blockchain with other blockchains running parallel to it - maybe like a rope made of many different cords. Side chains are not primarily a scaling solution. If you want to send a payment to an address that is on a different side chain you would create 2 transactions. Payment channels Many payments between two predetermined parties.
Useful when two parties pay each other multiple times; not necessarily good for paying many different accounts relatively few times each. What we want is payments from anyone to anyone: payment channels between many parties in a multi-hop hub and spoke model, with minimally trusted intermediaries - they can't take your coins, but they could conceivably cause problems. This requires the malleability fix that occurred in 2017. Previous soft forks: Pay to script hash, bip34. What are payment channels? They use multisig, and allow two people to send transactions to each other quickly without hitting the blockchain. A 2-party unidirectional channel: Alice and Bob create a multisig address that they each control. Alice wants to send 1 BTC to the multisig address. Before she does this, she gets Bob's refund signature; this means that at worst, Alice loses access to her coin for 30 days. Bob creates a 30 day nLockTime refund transaction, signs it, and sends it to Alice. Alice can either sign it immediately and keep it, or wait and sign it herself later (keeping it unsigned until needed). Once Alice has the refund signature she knows it's safe to send her BTC to the multisig address she and Bob just created."},{"title":"How to buy\u00a0Bitcoin","category":"Technical/Cryptocurrencies","url":"buying-btc.html","date":"7 December 2017","tags":"bitcoin ","body":"Recently a few friends have asked me how they can buy Bitcoin. I'm not a financial advisor, but here are a few things that come to mind: Don't invest what you can't afford to lose. If the price falls 50%, you need to be able to wait whilst the market recovers. Write down the following: How much can you afford to invest? Consider how much cash you will need over the next year, how long it would take to recover any losses, etc. How long do you want to invest for? How much profit do you want to make?
You will need to get good at identifying and ignoring the following, even from your friends: Hype Fake news Fear, Uncertainty, Doubt (FUD) Don't trust. The previous point is really important. By putting something you care about in a risky situation (it is risky) you will experience anxiety and excitement. You need to control your psychology and identity when other people are trying to manipulate you. This is useful in all areas of life. Investing is good mental exercise because money has an intensely psychological quality about it, and cryptocurrencies are the most intense trading experience there is right now. Almost no one has a clue what's happening. Conventional economists and traders certainly don't. This is new territory and the rules haven't been worked out yet. We've never had this tech before, and the internet has made everything - communication and innovation - much quicker. This is a powerful combination of factors and we haven't seen them play out before. Take some time to read about the fundamentals and understand the tech as much as you can. Think about why people would behave in certain ways and what makes bitcoin useful, or not. Get started on this today, it will take some time. YouTube has a tonne of videos, and these sites are good and detailed: lopp.net Twitter has a lot of current and new information, but also a lot of bots and scammers spreading hype and FUD (see point 4). If you're going to lose sleep over your investment, invest less. Don't buy at an all time high. Do your research, wait for the price to correct. Prices fall. The chart at the top of the page shows the 20 day and 55 day moving average compared to the daily price.
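The 20-day and 55-day moving averages in that chart are plain rolling means of the daily closing price. A minimal sketch, using made-up prices rather than real market data:

```python
# Simple moving averages like the 20-day and 55-day lines in the chart.
# The closing prices below are placeholders, not real market data.
def sma(prices, window):
    """Simple moving average; None until a full window of data exists."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]

closes = [100 + i for i in range(60)]  # placeholder uptrend
sma20 = sma(closes, 20)
sma55 = sma(closes, 55)

# In a steady uptrend the shorter average sits above the longer one:
print(sma20[-1], sma55[-1])  # -> 149.5 132.0
```

Comparing where the daily price sits relative to these two lines is what the chart commentary below is doing.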
My opinion is that by early January the price will have returned to between the 20 day and 55 day average before beginning to rise again. Coinbase is a user friendly and reputable exchange; there are other good exchanges too. Don't make financial decisions when you're feeling rushed. If you've bought some Bitcoin or other cryptocurrencies, don't store them on an exchange. Transfer them to a wallet that you control. If you don't own your private key, you don't own the asset. If you don't know what that means, google it (Point 8)."},{"title":"Live near the\u00a0ocean","category":"Non-technical/Journal","url":"ocean.html","date":"23 November 2017","tags":"water ","body":"California Dorset Acapulco Dublin"},{"title":"Pangea","category":"Technical/Cryptocurrencies","url":"pangea.html","date":"22 November 2017","tags":"bitnation, pangea ","body":"The problem In many countries the ability to create legally binding agreements is not available to average citizens. Legal services are often unaffordable and opaque, or service providers are corrupt. Legal services are in need of disruption. The solution: Pangea Bitnation (who I consult for) intends to address this problem by empowering people to self-organise and self-govern. We are building a platform called Pangea which allows users to create, notarise and arbitrate contracts according to a jurisdiction which each party joins voluntarily, irrespective of their location. Pangea is a smart phone app that looks and feels like a chat app; the back-end (called Panthalassa) is an encrypted mesh network hooked up to a blockchain. Rewarded for doing good, empowered to be a good citizen On Pangea, people are incentivised to be good citizens by receiving rewards for doing good, rather than being coerced by the threat of punishment for bad behaviour. This platform would fulfil a vast and unmet need, particularly in countries whose legal systems function poorly. On Pangea, a user voluntarily chooses which jurisdiction to be a part of.
Contracts are then notarised, executed and arbitrated according to that jurisdiction. Users voluntarily join a decentralised borderless voluntary nation (DBVN) and will receive tokens (Pangea Arbitration Tokens) as a reward for good behaviour. The tokens will be tradable and will be used as payment on Pangea for notarisation and arbitration. Combining a store of value and access to legal services Societies cannot escape their need to use currency as a store of value. They also cannot escape their need to create reliable and enforceable agreements (contracts) with each other. Generating Bitcoin through proof of work occurs because individuals believe that Bitcoin will be continuously used - that it will meet an ongoing need to transact using a decentralised and trustless currency. Generating PAT by being a good citizen will occur because individuals believe that Pangea will be continuously used - that it will meet an ongoing need to create enforceable agreements using a voluntary and geographically agnostic (and decentralised) platform. Comments There is much still to say about the Pangea platform and the mechanisms which will make it function. Please give feedback and ask questions in the comments. To find out more, visit the website."},{"title":"Bitcoin compared to\u00a0gold","category":"Technical/Cryptocurrencies","url":"bitcoin-vs-gold.html","date":"21 November 2017","tags":"bitcoin, gold, safe haven ","body":"A safe haven asset is something to buy during economic uncertainty. Historically, the safest asset you can buy has been gold. This is not because of anything inherently special about gold, but because that is what people believe to be the best long term method of storing value. People believe gold is special because they assume that in future other people will believe it's special. Criteria for a safe haven asset A safe-haven asset must fulfil the following: Price isn't controlled by any single party, including a state or bank.
The market is spread out beyond the reach of any one organisation. This is important because an asset which is issued, controlled or backed by an organisation has its value tied to the health of that organisation. Supply isn't controlled by any single party, including a state, bank or anyone else - it exists naturally and the rate at which it's produced or traded is beyond the control of any single actor. Supply is limited. The effort required to create the asset naturally limits the supply. The asset doesn't wear out or expire. It's prohibitively expensive to fake. Almost everyone considers it to be precious and valuable. It can be stored and transported simply. It's not delicate or volatile. For these reasons, and because of historical consensus, people have been happy to use gold as a store of value in times of economic uncertainty or for long durations. Other assets also meet these requirements to varying degrees. Bitcoin compared to gold Consider why gold is so good as a safe haven asset and long term value store. For all the reasons above, bitcoin is better, except one: at present, not many people consider it to be precious and valuable, so the market is small. This will change as confidence and awareness increase, and the eco-system of services and tools matures. The fundamentals are strong: A decentralised network ensures that Bitcoin can't be regulated or manipulated by any single government or organisation. The Bitcoin network can't be turned off. The present and future rate of supply is publicly available and unchangeable. This increases market efficiency and creates more rational pricing than a market where the rate of supply is unknown. Supply is naturally limited using proof of work. Bitcoin doesn't corrode or wear out. Bitcoin is impossible to fake. Bitcoin can be stored and transported more easily than gold.
- If you can remember 24 words then you can access your bitcoin for free from anywhere in the world. Read more This article, published a week after I wrote this post, looks at different factors to consider when evaluating Bitcoin's value. It goes into a lot of detail, relative to what I've seen elsewhere."},{"title":"Hardware\u00a0Wallets","category":"Technical/Cryptocurrencies","url":"wallet.html","date":"8 November 2017","tags":"wallet, crypto, blockchain, bitcoin ","body":"What is a hardware wallet? A hardware wallet (HW wallet) is a physical device that stores the information required to access digital currency or assets. It is plugged into a computer via USB in order to initiate or confirm transactions on the Bitcoin, Ethereum or other digital asset networks. They are a secure method of storing cryptographic data. They are so secure that they can be used on a compromised computer. All that is needed to access funds using a HW wallet (in addition to the device itself) is a PIN code which the user chooses. A single HW wallet can store multiple currencies in parallel. HW wallets are an easier solution than remembering a good password, and safer than storing the data in a file on my computer or online. The best known hardware wallet brands are Ledger and Trezor. The problem HW wallets are technically great, but their size and shape creates a bad user experience. A good hardware wallet should be convenient to use multiple times each day, like a credit card is. Current hardware wallets don't fit into a (money) wallet, and people don't want to carry more objects in their pockets. They are too big and are a bad shape. HW wallets look like they might belong on a keyring, but it's inconvenient and insecure to attach a credit card to a keyring, and the same is true for an HW wallet. I might store my keys on a hook by my door, but I would never leave my wallet there overnight.
I often want to keep my keys and money separate because I need my keys when I'm near my house, where I don't need to buy stuff, and I need my money when I'm away from my house, where I don't need my keys. I want to keep my bank cards and cash together in one safe place, and I don't want to carry around a dongle as well. It's easier to lose a separate dongle, or have it stolen, than something that fits next to my credit card in my wallet. The inconvenience is a barrier to enjoying the advantages of hardware wallets. The goal Create a HW wallet that is the size and shape of a credit card; it could be 3 times thicker than a credit card and still fit in a normal wallet. It needs: a display - it could be a low resolution b&w display; two or more buttons; to plug into a USB port; to securely sign transactions."},{"title":"Trading digital\u00a0assets","category":"Technical/Data","url":"algo-trading.html","date":"28 October 2017","tags":"bitcoin, litecoin, ethereum, finance ","body":"[Table of contents and interactive code-toggle widget omitted; the notebook sections cover importing data, asset analysis of Ethereum and Litecoin vs BTC, SMA gains with different SMA combinations through time, and next steps.] This analysis was made using Python.
If you'd like to see the code used, click the toggle.
vec2(dHi.y -dHi.x)) / gl_Positio = vec4(p + n, 0, 1);\\n pickA = pick0;\\n pickB = mediump GLSLIFY 1\\n\\nunifo vec4 vec4 pickA, pickB;\\n\\n main() {\\n vec4 fragId = 0.0);\\n if(pickB.w > pickA.w) {\\n fragId.xyz = pickB.xyz; }\\n\\n fragId += fragId.y += floor(frag / 256.0);\\n fragId.x -= floor(frag / 256.0) * 256.0;\\n\\n fragId.z += floor(frag / 256.0);\\n fragId.y -= floor(frag / 256.0) * 256.0;\\n\\n fragId.w += floor(frag / 256.0);\\n fragId.z -= floor(frag / 256.0) * 256.0;\\n\\n gl_FragCol = fragId / highp GLSLIFY 1\\n\\nattri vec2 aHi, aLo, vec2 scaleHi, translateH scaleLo, translateL float projectVal scHi, vec2 trHi, vec2 scLo, vec2 trLo, vec2 posHi, vec2 posLo) {\\n return (posHi + trHi) * scHi\\n + (posLo + trLo) * scHi\\n + (posHi + trHi) * scLo\\n + (posLo + trLo) * main() {\\n vec2 p = translateH scaleLo, translateL aHi, aLo);\\n if(dHi.y e+n;var null;var FLOAT_MAX) {\\n return vec4(127.0 128.0, 0.0, 0.0) / 255.0;\\n } else if(v \"+t[1]+\", \"+t[2]+\", t=new e=new r=new \"+t[1]+\", n=\"precisi mediump GLSLIFY 1\\n\\nunifo vec3 float vec3 vec4 f_id;\\n\\nv main() {\\n || \\n {\\n discard;\\n }\\n gl_FragCol = vec4(pickI mediump GLSLIFY 1\\n\\nattri vec3 position, vec4 vec2 uv;\\n\\nuni mat4 model\\n , view\\n , vec3 eyePositio , vec3 f_normal\\n , , , vec4 vec2 f_uv;\\n\\nv main() {\\n vec4 m_position = model * vec4(posit 1.0);\\n vec4 t_position = view * m_position gl_Positio = projection * t_position f_color = color;\\n f_normal = normal;\\n f_data = position;\\ f_eyeDirec = eyePositio - position;\\ = lightPosit - position;\\ f_uv = mediump GLSLIFY 1\\n\\nfloat x, float roughness) {\\n float NdotH = max(x, 0.0001);\\n float cos2Alpha = NdotH * NdotH;\\n float tan2Alpha = (cos2Alpha - 1.0) / cos2Alpha; float roughness2 = roughness * roughness; float denom = * roughness2 * cos2Alpha * cos2Alpha; return exp(tan2Al / roughness2 / vec3 vec3 vec3 float roughness, float fresnel) {\\n\\n float VdotN = 0.0);\\n float LdotN = 0.0);\\n\\n //Half angle 
vector\\n vec3 H = + //Geometri term\\n float NdotH = H), 0.0);\\n float VdotH = H), 0.000001); float LdotH = H), 0.000001); float G1 = (2.0 * NdotH * VdotN) / VdotH;\\n float G2 = (2.0 * NdotH * LdotN) / LdotH;\\n float G = min(1.0, min(G1, G2));\\n \\n //Distribu term\\n float D = //Fresnel term\\n float F = pow(1.0 - VdotN, fresnel);\\ //Multiply terms and done\\n return G * F * D / max(3.1415 * VdotN, vec3 float roughness\\ , fresnel\\n , kambient\\n , kdiffuse\\n , kspecular\\ , sampler2D vec3 f_normal\\n , , , vec4 vec2 f_uv;\\n\\nv main() {\\n || \\n {\\n discard;\\n }\\n\\n vec3 N = vec3 L = vec3 V = \\n {\\n N = -N;\\n }\\n\\n float specular = V, N, roughness, fresnel);\\ float diffuse = min(kambie + kdiffuse * max(dot(N, L), 0.0), 1.0);\\n\\n vec4 surfaceCol = f_color * f_uv);\\n vec4 litColor = surfaceCol * vec4(diffu * + kspecular * vec3(1,1,1 * specular, 1.0);\\n\\n gl_FragCol = litColor * mediump GLSLIFY 1\\n\\nattri vec3 vec4 vec2 uv;\\n\\nuni mat4 model, view, vec4 vec3 vec2 f_uv;\\n\\nv main() {\\n gl_Positio = projection * view * model * vec4(posit 1.0);\\n f_color = color;\\n f_data = position;\\ f_uv = mediump GLSLIFY 1\\n\\nunifo vec3 sampler2D float vec4 vec3 vec2 f_uv;\\n\\nv main() {\\n || \\n {\\n discard;\\n }\\n\\n gl_FragCol = f_color * f_uv) * mediump GLSLIFY 1\\n\\nattri vec3 vec4 vec2 uv;\\nattri float mat4 model, view, vec3 vec4 vec2 f_uv;\\n\\nv main() {\\n || \\n {\\n gl_Positio = } else {\\n gl_Positio = projection * view * model * vec4(posit 1.0);\\n }\\n gl_PointSi = pointSize; f_color = color;\\n f_uv = mediump GLSLIFY 1\\n\\nunifo sampler2D float vec4 vec2 f_uv;\\n\\nv main() {\\n vec2 pointR = - if(dot(poi pointR) > 0.25) {\\n discard;\\n }\\n gl_FragCol = f_color * f_uv) * mediump GLSLIFY 1\\n\\nattri vec3 vec4 id;\\n\\nuni mat4 model, view, vec3 vec4 f_id;\\n\\nv main() {\\n gl_Positio = projection * view * model * vec4(posit 1.0);\\n f_id = id;\\n f_position = mediump GLSLIFY 1\\n\\nattri vec3 float vec4 id;\\n\\nuni mat4 
model, view, vec3 vec3 vec4 f_id;\\n\\nv main() {\\n || \\n {\\n gl_Positio = } else {\\n gl_Positio = projection * view * model * vec4(posit 1.0);\\n gl_PointSi = pointSize; }\\n f_id = id;\\n f_position = mediump GLSLIFY 1\\n\\nattri vec3 mat4 model, view, main() {\\n gl_Positio = projection * view * model * vec4(posit mediump GLSLIFY 1\\n\\nunifo vec3 main() {\\n gl_FragCol = i(t){for(v null;for(v function() E=new new s(\"\",\"Inva data type for attribute \"+h+\": new s(\"\",\"Unkn data type for attribute \"+h+\": \"+f);var new s(\"\",\"Inva data type for attribute \"+h+\": n(t){retur new i(t,e){for r=new new s(\"\",\"Inva uniform dimension type for matrix \"+name+\": new s(\"\",\"Unkn uniform data type for \"+name+\": \"+r)}var new s(\"\",\"Inva data new data type for vector \"+name+\": r=[];for(v n in e){var r}function h(e){for(v n=[\"return function new s(\"\",\"Inva data new s(\"\",\"Inva uniform dimension type for matrix \"+name+\": \"+t);retur i(r*r,0)}t new s(\"\",\"Unkn uniform data type for \"+name+\": \"+t)}}func i){var p(t){var r=0;r1){l[ u=1;u1)for l=0;l=0){v t||t}funct s(t){funct r(){for(va u=0;u 1.0) {\\n discard;\\n }\\n baseColor = color, step(radiu gl_FragCol = * baseColor. 
mediump GLSLIFY 1\\n\\nattri vec2 vec4 mat3 float vec4 vec4 main() {\\n vec3 hgPosition = matrix * vec3(posit 1);\\n gl_Positio = 0, gl_PointSi = pointSize; vec4 id = pickId + pickOffset id.y += floor(id.x / 256.0);\\n id.x -= floor(id.x / 256.0) * 256.0;\\n\\n id.z += floor(id.y / 256.0);\\n id.y -= floor(id.y / 256.0) * 256.0;\\n\\n id.w += floor(id.z / 256.0);\\n id.z -= floor(id.z / 256.0) * 256.0;\\n\\n fragId = mediump GLSLIFY 1\\n\\nvaryi vec4 main() {\\n float radius = length(2.0 * - 1.0);\\n if(radius > 1.0) {\\n discard;\\n }\\n gl_FragCol = fragId / i(t,e){var instanceof instanceof null;var n(t,e,r,n) highp GLSLIFY 1\\n\\n\\nvec posHi, vec2 posLo, vec2 scHi, vec2 scLo, vec2 trHi, vec2 trLo) {\\n return vec4((posH + trHi) * scHi\\n \\t\\t\\t//FI this thingy does not give noticeable precision gain, need test\\n + (posLo + trLo) * scHi\\n + (posHi + trHi) * scLo\\n + (posLo + trLo) * scLo\\n , 0, vec2 positionHi float size, vec2 char, is 64-bit form of scale and vec2 scaleHi, scaleLo, translateH float vec4 sampler2D vec4 charColor, vec2 vec2 float float main() {\\n charColor = vec2(color / 255., 0));\\n borderColo = vec2(color / 255., 0));\\n\\n gl_PointSi = size * pixelRatio pointSize = size * charId = char;\\n borderWidt = border;\\n\\ gl_Positio = positionHi positionLo scaleHi, scaleLo,\\n translateH pointCoord = viewBox.xy + (viewBox.z - viewBox.xy * * .5 + highp GLSLIFY 1\\n\\nunifo sampler2D vec2 float charsStep, pixelRatio vec4 vec4 vec2 vec2 float float main() {\\n\\tvec2 pointUV = (pointCoor - + pointSize * .5) / = 1. - texCoord = ((charId + pointUV) * charsStep) / dist = alpha\\n\\ti (dist t;){var w.push(new i(){var a(t,e){var e=void null;var number of characters is more than maximum texture size. Try reducing x=0;x 1.0) {\\n discard;\\n }\\n vec4 baseColor = color, float alpha = 1.0 - pow(1.0 - baseColor. 
fragWeight gl_FragCol = * alpha, highp GLSLIFY 1\\n\\nvec4 pfx_1_0(ve scaleHi, vec2 scaleLo, vec2 translateH vec2 translateL vec2 positionHi vec2 positionLo {\\n return + translateH * scaleHi\\n + (positionL + translateL * scaleHi\\n + (positionH + translateH * scaleLo\\n + (positionL + translateL * scaleLo, 0.0, vec2 positionHi vec4 vec2 scaleHi, scaleLo, translateH float vec4 vec4 main() {\\n\\n vec4 id = pickId + pickOffset id.y += floor(id.x / 256.0);\\n id.x -= floor(id.x / 256.0) * 256.0;\\n\\n id.z += floor(id.y / 256.0);\\n id.y -= floor(id.y / 256.0) * 256.0;\\n\\n id.w += floor(id.z / 256.0);\\n id.z -= floor(id.z / 256.0) * 256.0;\\n\\n gl_Positio = scaleLo, translateH translateL positionHi positionLo gl_PointSi = pointSize; fragId = mediump GLSLIFY 1\\n\\nvaryi vec4 main() {\\n float radius = length(2.0 * - 1.0);\\n if(radius > 1.0) {\\n discard;\\n }\\n gl_FragCol = fragId / i(t,e){var e(e,r){ret e in n(t,e){var in r)return r[t];for(v o=r.gl d(t){var null;var a(t,e){ret new E=new i(t,e){var r=new n(t);retur 0.0 ||\\n || {\\n discard;\\n }\\n\\n vec3 N = vec3 V = vec3 L = {\\n N = -N;\\n }\\n\\n float specular = V, N, roughness) float diffuse = min(kambie + kdiffuse * max(dot(N, L), 0.0), 1.0);\\n\\n //decide how to interpolat color \\u2014 in vertex or in fragment\\n vec4 surfaceCol = .5) * vec2(value value)) + step(.5, vertexColo * vColor;\\n\\ vec4 litColor = surfaceCol * vec4(diffu * + kspecular * vec3(1,1,1 * specular, 1.0);\\n\\n gl_FragCol = mix(litCol contourCol contourTin * mediump GLSLIFY 1\\n\\nattri vec4 uv;\\nattri float f;\\n\\nunif mat3 mat4 model, view, float height, sampler2D float value, kill;\\nvar vec3 vec2 vec3 eyeDirecti vec4 main() {\\n vec3 dataCoordi = permutatio * vec3(uv.xy height);\\n vec4 worldPosit = model * 1.0);\\n\\n vec4 clipPositi = projection * view * clipPositi = clipPositi + zOffset;\\n gl_Positio = value = f;\\n kill = -1.0;\\n = = uv.zw;\\n\\n vColor = vec2(value value));\\n //Don't do lighting for contours\\n 
surfaceNor = vec3(1,0,0 eyeDirecti = vec3(0,1,0 lightDirec = mediump GLSLIFY 1\\n\\nunifo vec2 vec3 float float value, kill;\\nvar vec3 vec2 vec3 v) {\\n float vh = 255.0 * v;\\n float upper = floor(vh); float lower = fract(vh); return vec2(upper / 255.0, floor(lowe * 16.0) / main() {\\n if(kill > 0.0 ||\\n || {\\n discard;\\n }\\n vec2 ux = / shape.x);\\ vec2 uy = / shape.y);\\ gl_FragCol = vec4(pickI ux.x, uy.x, ux.y + i(t){var o(t,e){var new invalid coordinate for new Invalid texture size\");ret s(t,e){ret new Invalid ndarray, must be 2d or 3d\");var new Invalid shape for new Invalid shape for pixel new Incompatib texture format for new Invalid texture new Floating point textures not supported on this platform\") s=u(t);ret s=u(t);ret f(t,e){var new Invalid texture size\");var new Invalid shape for new Invalid shape for pixel b=u(t);ret new Error(\"gl- Too many vertex n(t,e,r){v i=new n(t){for(v n(t,e){var n(t,e,r){v instanceof a=new a(t,e){ret o(t){for(v e=[\"functi orient(){v orient\");v n=new a(t,e){var o(t,e){var s(t,e){var i}}functio c(t,e){for s(this,t); s(this,t); b}for(var r}return n}return l}function i(t,e,r,n) n(t,e){var r;if(h(t)) new Error('Unk function type -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 
2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n\\n // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def float vec2 vec2 vec2 vec2 vec2 vec2 float float sampler2D vec2 vec2 float float main() {\\n // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line or when fading out\\n // float blur = u_blur * float alpha = clamp(min( - (v_linewid - blur), v_linewidt - dist) / blur, 0.0, 1.0);\\n\\n float x_a = / 1.0);\\n float x_b = / 1.0);\\n float y_a = 0.5 + (v_normal. * v_linewidt / float y_b = 0.5 + (v_normal. * v_linewidt / vec2 pos_a = vec2(x_a, y_a));\\n vec2 pos_b = vec2(x_b, y_b));\\n\\n vec4 color = pos_a), pos_b), u_fade);\\n alpha *= u_opacity; gl_FragCol = color * gl_FragCol = highp lowp\\n#def floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. 
the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. Use this value to unscale the vec2 vec4 mat4 mediump float mediump float mediump float mediump float mediump float mat2 mediump float vec2 vec2 float float main() {\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * // We store the texture normals in the most insignific bit\\n // transform y so that 0 => -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n v_linesofa = // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def lowp vec4 lowp float float sampler2D float float vec2 vec2 vec2 vec2 float main() {\\n // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line or when fading out\\n // float blur = u_blur * float alpha = clamp(min( - (v_linewid - blur), v_linewidt - dist) / blur, 0.0, 1.0);\\n\\n float sdfdist_a = v_tex_a).a float sdfdist_b = v_tex_b).a float sdfdist = mix(sdfdis sdfdist_b, u_mix);\\n alpha *= smoothstep - u_sdfgamma 0.5 + u_sdfgamma sdfdist);\\ gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. 
Use this value to unscale the vec2 vec4 mat4 mediump float mediump float mediump float mediump float vec2 float vec2 float float mat2 mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * // We store the texture normals in the most insignific bit\\n // transform y so that 0 => -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n\\n v_tex_a = * normal.y * + u_tex_y_a) v_tex_b = * normal.y * + // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def mapbox: define lowp vec4 mapbox: define lowp float vec2 v_pos;\\n\\n main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ float dist = length(v_p - float alpha = 0.0, dist);\\n gl_FragCol = outline_co * (alpha * gl_FragCol = highp lowp\\n#def vec2 mat4 vec2 vec2 mapbox: define lowp vec4 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 vec2 v_pos;\\n\\n main() {\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n // find distance to outline for alpha float dist = length(v_p - float alpha = 0.0, dist);\\n \\n\\n gl_FragCol = mix(color1 color2, u_mix) * alpha * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec2 float float float vec2 mat4 vec2 vec2 vec2 vec2 v_pos;\\n\\n main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many 
pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 main() {\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n gl_FragCol = mix(color1 color2, u_mix) * gl_FragCol = highp lowp\\n#def mat4 vec2 vec2 vec2 vec2 float float float vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. 
We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / mediump lowp\\n#def float float sampler2D sampler2D vec2 vec2 float float float float vec3 main() {\\n\\n // read and cross-fade colors from the main and parent tiles\\n vec4 color0 = v_pos0);\\n vec4 color1 = v_pos1);\\n vec4 color = color0 * u_opacity0 + color1 * u_opacity1 vec3 rgb = color.rgb; // spin\\n rgb = vec3(\\n dot(rgb, dot(rgb, dot(rgb, // saturation float average = (color.r + color.g + color.b) / 3.0;\\n rgb += (average - rgb) * // contrast\\n rgb = (rgb - 0.5) * + 0.5;\\n\\n // brightness vec3 u_high_vec = vec3 u_low_vec = gl_FragCol = u_low_vec, rgb), gl_FragCol = highp lowp\\n#def mat4 vec2 float float vec2 vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos0 = / 32767.0) - 0.5) / u_buffer_s ) + 0.5;\\n v_pos1 = (v_pos0 * + mediump lowp\\n#def sampler2D sampler2D lowp float vec2 vec2 main() {\\n lowp float alpha = v_fade_tex * u_opacity; gl_FragCol = v_tex) * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool vec2 vec2 vec2 vec2 main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ vec2 extrude = * (a_offset / 64.0);\\n if {\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * } else {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def 
sampler2D sampler2D lowp vec4 lowp float lowp float lowp float vec2 vec2 float main() {\\n lowp float dist = v_tex).a;\\ lowp float fade_alpha = lowp float gamma = u_gamma * lowp float alpha = - gamma, u_buffer + gamma, dist) * gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def float PI = vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool bool mediump float mediump float mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ // map\\n // map | viewport\\n if {\\n lowp float angle = ? (a_data[1] / 256.0 * 2.0 * PI) : u_bearing; lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, asin, -1.0 * asin, acos);\\n vec2 offset = RotationMa * a_offset;\\ vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * // viewport\\n // map\\n } else if {\\n // foreshorte factor to apply on pitched maps\\n // as a label goes from horizontal vertical in angle\\n // it goes from 0% foreshorte to up to around 70% lowp float pitchfacto = 1.0 - cos(u_pitc * sin(u_pitc * 0.75));\\n\\ lowp float lineangle = a_data[1] / 256.0 * 2.0 * PI;\\n\\n // use the lineangle to position points a,b along the line\\n // project the points and calculate the label angle in projected space\\n // this calculatio allows labels to be rendered unskewed on pitched maps\\n vec4 a = u_matrix * vec4(a_pos 0, 1);\\n vec4 b = u_matrix * vec4(a_pos + 0, 1);\\n lowp float angle = - b[0]/b[3] - a[0]/a[3]) lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, -1.0 * asin, asin, acos);\\n\\n vec2 offset = RotationMa * 1.0) * a_offset); vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix 
* vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n gl_Positio += z * // viewport\\n // viewport\\n } else {\\n vec2 extrude = * (a_offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_gamma_sc = (gl_Positi - 0.5);\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def float float float float main() {\\n\\n float alpha = 0.5;\\n\\n gl_FragCol = vec4(0.0, 1.0, 0.0, 1.0) * alpha;\\n\\n if > u_zoom) {\\n gl_FragCol = vec4(1.0, 0.0, 0.0, 1.0) * alpha;\\n }\\n\\n if (u_zoom >= v_max_zoom {\\n gl_FragCol = vec4(0.0, 0.0, 0.0, 1.0) * alpha * 0.25;\\n }\\n\\n if >= u_maxzoom) {\\n gl_FragCol = vec4(0.0, 0.0, 1.0, 1.0) * alpha * 0.2;\\n highp lowp\\n#def vec2 vec2 vec2 mat4 float float float main() {\\n gl_Positio = u_matrix * vec4(a_pos + a_extrude / u_scale, 0.0, 1.0);\\n\\n v_max_zoom = a_data.x;\\ = vec4 values, const float t) {\\n if (t 7)return[n have been deprecated as of v8\")];if(! in \"%s\" not strict\";va a(l,e,\"arr expected, %s a(l,e,\"arr length %d expected, length %d r?[new have been deprecated as of v8\")]:[];v n(e,r,\"obj expected, %s found\",a)] o=[];for(v s in must start with \"@\"'));ret strict\";va one of [%s], %s strict\";va t(e){var n(l,s,\"arr expected, %s n(l,s,'\"$t cannot be use with operator n(l,s,'fil array for operator \"%s\" must have 3 expected, %s key cannot be a functions not functions not strict\";va url must include a \"{fontstac url must include a \"{range}\" strict\";va n(c,r,'eit \"type\" or \"ref\" is i(e,r,\"%s is greater than the maximum value strict\";va n(e,r,\"obj expected, %s f in r){var property in n(e,r,'mis required property strict\";va i(e,o,'unk property strict\";va n(r,e,'\"ty is e)for(var c in a(t){retur Sans Unicode MS new new M=new in n){for(var many symbols being rendered in a tile. See many glyphs being rendered in a tile. 
In [3]:
## Setup - appearance

# get rid of the annoying SettingWithCopyWarning
# (reconstructed: the original line was truncated to "= None # default='w")
pd.options.mode.chained_assignment = None  # default='warn'

# allow more than one print of an unassigned variable per cell
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = "all"

# offline plotly
color1 = 'red'
color2 = '#137a28'  # dark green
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n v_linesofa = // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def lowp vec4 lowp float float sampler2D float float vec2 vec2 vec2 vec2 float main() {\\n // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line or when fading out\\n // float blur = u_blur * float alpha = clamp(min( - (v_linewid - blur), v_linewidt - dist) / blur, 0.0, 1.0);\\n\\n float sdfdist_a = v_tex_a).a float sdfdist_b = v_tex_b).a float sdfdist = mix(sdfdis sdfdist_b, u_mix);\\n alpha *= smoothstep - u_sdfgamma 0.5 + u_sdfgamma sdfdist);\\ gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. 
Use this value to unscale the vec2 vec4 mat4 mediump float mediump float mediump float mediump float vec2 float vec2 float float mat2 mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * // We store the texture normals in the most insignific bit\\n // transform y so that 0 => -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n\\n v_tex_a = * normal.y * + u_tex_y_a) v_tex_b = * normal.y * + // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def mapbox: define lowp vec4 mapbox: define lowp float vec2 v_pos;\\n\\n main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ float dist = length(v_p - float alpha = 0.0, dist);\\n gl_FragCol = outline_co * (alpha * gl_FragCol = highp lowp\\n#def vec2 mat4 vec2 vec2 mapbox: define lowp vec4 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 vec2 v_pos;\\n\\n main() {\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n // find distance to outline for alpha float dist = length(v_p - float alpha = 0.0, dist);\\n \\n\\n gl_FragCol = mix(color1 color2, u_mix) * alpha * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec2 float float float vec2 mat4 vec2 vec2 vec2 vec2 v_pos;\\n\\n main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many 
pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 main() {\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n gl_FragCol = mix(color1 color2, u_mix) * gl_FragCol = highp lowp\\n#def mat4 vec2 vec2 vec2 vec2 float float float vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. 
We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / mediump lowp\\n#def float float sampler2D sampler2D vec2 vec2 float float float float vec3 main() {\\n\\n // read and cross-fade colors from the main and parent tiles\\n vec4 color0 = v_pos0);\\n vec4 color1 = v_pos1);\\n vec4 color = color0 * u_opacity0 + color1 * u_opacity1 vec3 rgb = color.rgb; // spin\\n rgb = vec3(\\n dot(rgb, dot(rgb, dot(rgb, // saturation float average = (color.r + color.g + color.b) / 3.0;\\n rgb += (average - rgb) * // contrast\\n rgb = (rgb - 0.5) * + 0.5;\\n\\n // brightness vec3 u_high_vec = vec3 u_low_vec = gl_FragCol = u_low_vec, rgb), gl_FragCol = highp lowp\\n#def mat4 vec2 float float vec2 vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos0 = / 32767.0) - 0.5) / u_buffer_s ) + 0.5;\\n v_pos1 = (v_pos0 * + mediump lowp\\n#def sampler2D sampler2D lowp float vec2 vec2 main() {\\n lowp float alpha = v_fade_tex * u_opacity; gl_FragCol = v_tex) * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool vec2 vec2 vec2 vec2 main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ vec2 extrude = * (a_offset / 64.0);\\n if {\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * } else {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def 
sampler2D sampler2D lowp vec4 lowp float lowp float lowp float vec2 vec2 float main() {\\n lowp float dist = v_tex).a;\\ lowp float fade_alpha = lowp float gamma = u_gamma * lowp float alpha = - gamma, u_buffer + gamma, dist) * gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def float PI = vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool bool mediump float mediump float mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ // map\\n // map | viewport\\n if {\\n lowp float angle = ? (a_data[1] / 256.0 * 2.0 * PI) : u_bearing; lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, asin, -1.0 * asin, acos);\\n vec2 offset = RotationMa * a_offset;\\ vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * // viewport\\n // map\\n } else if {\\n // foreshorte factor to apply on pitched maps\\n // as a label goes from horizontal vertical in angle\\n // it goes from 0% foreshorte to up to around 70% lowp float pitchfacto = 1.0 - cos(u_pitc * sin(u_pitc * 0.75));\\n\\ lowp float lineangle = a_data[1] / 256.0 * 2.0 * PI;\\n\\n // use the lineangle to position points a,b along the line\\n // project the points and calculate the label angle in projected space\\n // this calculatio allows labels to be rendered unskewed on pitched maps\\n vec4 a = u_matrix * vec4(a_pos 0, 1);\\n vec4 b = u_matrix * vec4(a_pos + 0, 1);\\n lowp float angle = - b[0]/b[3] - a[0]/a[3]) lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, -1.0 * asin, asin, acos);\\n\\n vec2 offset = RotationMa * 1.0) * a_offset); vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix 
* vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n gl_Positio += z * // viewport\\n // viewport\\n } else {\\n vec2 extrude = * (a_offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_gamma_sc = (gl_Positi - 0.5);\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def float float float float main() {\\n\\n float alpha = 0.5;\\n\\n gl_FragCol = vec4(0.0, 1.0, 0.0, 1.0) * alpha;\\n\\n if > u_zoom) {\\n gl_FragCol = vec4(1.0, 0.0, 0.0, 1.0) * alpha;\\n }\\n\\n if (u_zoom >= v_max_zoom {\\n gl_FragCol = vec4(0.0, 0.0, 0.0, 1.0) * alpha * 0.25;\\n }\\n\\n if >= u_maxzoom) {\\n gl_FragCol = vec4(0.0, 0.0, 1.0, 1.0) * alpha * 0.2;\\n highp lowp\\n#def vec2 vec2 vec2 mat4 float float float main() {\\n gl_Positio = u_matrix * vec4(a_pos + a_extrude / u_scale, 0.0, 1.0);\\n\\n v_max_zoom = a_data.x;\\ = vec4 values, const float t) {\\n if (t 7)return[n have been deprecated as of v8\")];if(! in \"%s\" not strict\";va a(l,e,\"arr expected, %s a(l,e,\"arr length %d expected, length %d r?[new have been deprecated as of v8\")]:[];v n(e,r,\"obj expected, %s found\",a)] o=[];for(v s in must start with \"@\"'));ret strict\";va one of [%s], %s strict\";va t(e){var n(l,s,\"arr expected, %s n(l,s,'\"$t cannot be use with operator n(l,s,'fil array for operator \"%s\" must have 3 expected, %s key cannot be a functions not functions not strict\";va url must include a \"{fontstac url must include a \"{range}\" strict\";va n(c,r,'eit \"type\" or \"ref\" is i(e,r,\"%s is greater than the maximum value strict\";va n(e,r,\"obj expected, %s f in r){var property in n(e,r,'mis required property strict\";va i(e,o,'unk property strict\";va n(r,e,'\"ty is e)for(var c in a(t){retur Sans Unicode MS new new M=new in n){for(var many symbols being rendered in a tile. See many glyphs being rendered in a tile. 
See exceeds allowed extent, reduce your vector tile buffer size\")}ret new new Error(\"Inv LngLat object: (\"+t+\", new new x(){return y(){return point(){re new new new new instanceof 0===s&&voi a(void new Error(\"fai to invert strict\";va n={\" strict\";va s(t){retur l(t,e,r,n) o=(new out of n(t,e){ret mapbox: ([\\w]+) ([\\w]+) ([\\w]+) a=new n?e(new Error(\"Inp data is not a valid GeoJSON t.data)ret e(new Error(\"Inp data is not a valid GeoJSON e(new Error(\"Inp data is not a valid GeoJSON e=0;ee)){v y;for(y in in p)c[y]=!0; t in new new i(t,e,i){v r(t,r){ret delete e(t);var n=new o(new new e=new in tile source layer \"'+M+'\" does not use vector tile spec v2 and therefore may have some rendering g(t,L);var F in B in n=new t.time>=(n void void t=new new i;var strict\";va new Error(\"Inv color o[e]}throw new Error(\"Inv color void n in r in Error('Sou layer does not exist on source \"'+e.id+'\" as specified by style layer t in t.id});for new Error(\"Sty is not done new Error(\"The is no source with this ID\");var delete instanceof this;var 0===e)thro new Error(\"The is no layer with this ID\");for(v r in this;var void 0===i||voi 0===a?void strict\";va i(t){retur t.value}va r,n;for(va i in t){var in for(n in in in 0===e)dele 0===e)dele o}var strict\";va new t){var this.grid= a}if(r){va _=u;for(va a}}}return r=new r(\"glyphs > 65535 not i=!t&&new l(new c(new g(e,r){var y(e,r){var i(0,0));re M in a)t[M]=new strict\";va t){var | n(){}var i(t){retur new 61:case 107:case 171:case 189:case 109:case t=0,e=0;re t=new null!==t&& new Error(\"max must be between the current minZoom and 20, t,e={};ret t instanceof e;if(t instanceof instanceof c?t:new i(this,e); void Error(\"Fai to initialize s in if(void if(void n(t){var r=new n(t){for(v e=0;e1)for delete error c(t,e,r){v f(t,e){for t in null;var delete new Error(\"An API access token is required to use Mapbox GL. See new Error(\"Use a public access token (pk.*) with Mapbox GL JS, not a secret access token (sk.*). 
See t}function i(t){retur a(t){retur t;var n(t){funct v[n];void in t=0;t=1)re 1;var void t={};for(v e in =0.22.0 =0.22.0 No README data run build-docs # invoked by publisher when publishing docs on the mb-pages --debug --standalo mapboxgl > && tap --no-cover build --github --format html -c --theme ./docs/_th --output --debug -t unassertif --plugin [minifyify --map --output --standalo mapboxgl > && tap --no-cover --debug -t envify > --ignore-p .gitignore js test bench diff --name-onl mb-pages HEAD -- | awk '{print | xargs build-toke watch-dev watch-benc build-toke watch-benc build-toke watch-dev run build-min && npm run build-docs && jekyll serve --no-cache --localhos --port 9966 --index index.html .\",test:\"n run lint && tap --reporter dot test/js/*/ && node && watchify bench/inde --plugin [minifyify --no-map] -t [babelify --presets react] -t unassertif -t envify -o bench/benc --debug --standalo mapboxgl -o n=new r=new r(t){var n(t,n){var i(t){retur t)return t){var 1=0)return V=1;V specify vertex creation specify cell creation specify phase strict\";va n(t){if(t in l)return l[t];for(v new Invalid boundary dst;};retu t in l){var t in u){var t in c){var return \"+s),u){va p=new p=new p()}functi for(var o=0;o1)for f(e,r){var s=\"__l\"+ i=\"__l\"+ _=[\"'use L=new L=new L(r)}funct s(t,e){var r=[\"'use [2,1,0];}e [1,0,2];}} [2,0,1];}e new new function new o=new 0===t){var 0===r){r=n o(t,e){var s(t,e){ret a(t,e){var i=new t||\"up\"in strict\";va r=void 0!==r?r+\"\" e(t,e){for t}function o)throw new to path.resol must be t)throw new to path.join must be n(t){for(v new Error(\"Giv varint doesn't fit into 10 bytes\");va o(t,e,r){v s(t,e){for new type: void n(t){var 0:return r||[];case 1:return 2:return Array(t);v r}var r(t,e){var Array(a),n n(t,e){for a(t){for(v t-e});var new t instanceof i(t){retur a(t){for(v a=1;i;){va l(t){for(v c(t){retur d(t){var u(m)}funct p(t){var 0x80 (not a basic code x});else for(_ in n(t,e){ret o;var o};var n(t,e){for n&&void e(t){var e=new 
Error(\"(re \"+t);throw n(t){retur t?\": i(t,r,i){t in r||e(\"unkn parameter possible values: parameter type\"+n(r) must be a typed parameter type\"+n(i) expected \"+r+\", got \"+typeof t)}functio parameter type, must be a nonnegativ shader source must be a string\",a) number \"+t+\": r=0;e(c(\"| compiling \"+s+\" shader, linking program with vertex shader, and fragment shader i(t){retur M(t,r){var n=m();e(t+ in command called from \"+n))}func A(t,e,r,i) in e||M(\"unkn parameter possible values: parameter type\"+n(r) expected \"+e+\", got \"+typeof texture format for renderbuff format for L(t,e){ret z(t,e,n){v pixel arguments to document,\" manually specify webgl context outside of DOM not supported, try upgrading your browser or graphics drivers name must be string\");v $(t){var et(t,e){va _e:r=new we:r=new Me:r=new ke:r=new Ae:r=new Te:r=new Se:r=new null}retur n=0;n0){va t[0]){var buffer data\")}els shape\");va data for buffer p=new n(a);retur d=[];retur t=0;return t&&t._buff instanceof a(t){var e||(e=new Ge:case Xe:case Ze:case type for element bit element buffers not supported, enable first\");va vertex count for buffer a}var t&&t._elem instanceof pt(t){for( At(t){retu Tt(t,e){va Or:case Fr:case Rr:case jr:var texture type, must specify a typed St(t,e){re for(var s}return o*r*n}func texture texture unpack n){var must enable the extension in order to use floating point must enable the extension in order to use 16-bit floating point must enable the extension in order to use depth/sten texture must be an extension not extension not d(e,r,i){v m(){return K.pop()||n h}function y(t,e,r){v b(t,e){var e){var e){var e){var e){var e){var i(t,e){var arguments to format for c=new T(nr);retu format for C=new z=new I(){for(va for(var P={\"don't care\":$r,\" mipmap mipmap mipmap mipmap s3tc dxt1\":Mr,\" s3tc dxt1\":kr,\" s3tc dxt3\":Ar,\" s3tc atc\":Sr,\"r atc explicit atc interpolat pvrtc pvrtc pvrtc pvrtc etc1\"]=Pr) r=B[e];ret null});ret number of texture shape for 
z||\"colors render targets not color buffer must enable in order to use floating point framebuffe must enable in order to use 16-bit floating point framebuffe must enable to use 16-bit render must enable in order to use 32-bit floating point color color format for color format for extension not u=d=1;var for(D=new color attachment \"+a+\" is color attachment much have the same number of bits per depth attachment for framebuffe stencil attachment for framebuffe depth-sten attachment for framebuffe not resize a framebuffe which is currently in use\");var i;for(var shape for framebuffe must be be d||\"colors render targets not color buffer color color format for l=1;var a(t){var t=0;return vertex fragment shader\",n) a=i[t];ret a||(a=new o(o){var must create a webgl context with in order to read pixels from the drawing cannot read from a from a framebuffe is only allowed for the types 'uint8' and from a framebuffe is only allowed for the type 'uint8'\")) arguments to buffer for regl.read( too s(t){var r;return l(t){retur l}function jt(t){retu Nt(t){retu Bt(){funct t(t){for(v r(){functi n(){var e=a();retu n(){var new new m(t){retur v(t,e,r){v g(t,e,r){v y(){var ei:var ri:return ni:return ii:return ai:return c={};retur n=e.id(t); in c)return c[n];var b(t){var in r){var if(Di in n){var e}function x(t,e){var in r){var i=r[Pi];re framebuffe in n){var a=n[Pi];re framebuffe null}funct n(t){if(t in i){var in a){var \"+t)});var in in e?new s=o;o=new w(t){funct r(t){if(t in i){var r});return n.id=r,n}i in a){var o=a[t];ret null}var r(t,r){if( in n){var in i){var s=i[t];ret in n){var in i){var o=i[Ri];re in n){var t=n[ji];re Be[t]})}if in i){var r=i[ji];re in \"+n,\"inval primitive, must be one of Aa}):new in n){var vertex t})}if(Ni in i){var r=i[Ni];re vertex s?new vertex offset/ele buffer too l=new k(t,e){var o(e,n){if( in r){var o})}else if(t in i){var vi:case si:case oi:case Ai:case hi:case Ci:case xi:case wi:case Mi:case pi:return flag fi:return in \"+i,\"inval \"+t+\", 
must be one of di:return color attachment for framebuffe sent to uniform data for uniform a[r],\"inva uniform or missing data for uniform T(t,r){var a&&a,\"inva data for attribute offset for attribute divisor for attribute parameter \"'+r+'\" for attribute pointer \"'+t+'\" (valid parameters are in r)return r[s];var in '+a+\"&&(ty dynamic attribute if(\"consta in \"+a+'.cons === in S(t){var a(t){var parameter L(t,e,r){v C(t,e,r,n) z(t,e,r){v n=m(e);if( in r.state)){ c,h;if(n in in I(t,e,r,n) if(mt(u)){ l(t){var ua:case da:case ga:return 2;case ca:case pa:case ya:return 3;case ha:case ma:case ba:return 1}}functio attribute i(i){var a=c[i];ret a(){functi o(){functi vertex vertex vertex i(t){retur n(e){var n=r.draw[e s(t){funct e(t){var args to args to e(t){if(t in r){var e=r[t];del delete l(t,e){var regl.clear with no buffer takes an object as cancel a frame callback must be a h(){var callback must be a function\") event, must be one of Kt={\"[obje renderbuff renderbuff arguments to renderbuff r(){return i(t){var s(){return p.pop()||n o}function u(t,e,r){v c(){var t(){var new requires at least one argument; got none.\");va e.href;var \",e);var s=new o;n=-(i+a) null;var n(t){retur n(t){for(v R;};return i(t){var e=s[t];ret strict\";\"u n(t){for(v i}function h(t,e){for r=new r}function r=new l(e)}funct u(t){for(v e=s(t);;){ t=k[0];ret f(t,e){var r=k[t];ret n(t,e){var l}else if(u)retur l}else if(u)retur u;return i(t,e){ret t.y-e}func a(t,e){for r=null;t;) t;var r}function l(t){for(v n=d.index; n(t,e){var i(t,e,r,n) o(t,e){for r}function s(t,e){for m}function s[t];for(v new unexpected new failed to parse named argument new failed to parse named argument new mixing positional and named placeholde is not (yet) s[t]=n}var n(t){for(v Array(e),n Array(e),i Array(e),a Array(e),o Array(e),s x=new u(t){retur c(t){var h(t){retur f(t){var d(t,e){for r in t}function p(t){retur t.x}functi m(t){retur t.y}var time\");var r=\"prepare \"+t.length %d clusters in c)|0 p=new Array(r),m 
Array(r),v Array(r),g p=new o}function s}function T(t){retur n=z(t);ret t){var r={};for(v i in e={};for(v r in n(t,e){var i(t,e){var s/6}return 1}var n&&void e(t,e){var for(a=0,n= n})}}var s;var in new Error(\"n must be new Error(\"alr s(t){retur new l(t){retur new u(t){retur new c(t){retur new h(t){retur new f(t){retur new d(t){retur new p(t){retur new m(t){retur x?new v(t){retur new n(t)}var null}retur t=0;tn)ret instanceof n)return t;var i=new n;return a(t){retur instanceof o(t,e){ret s(t,e){ret new 'url' must be a string, not \"+typeof t);var i(t,e){var a(t,e){var o(t,e){ret t}function s(t){var e={};retur a;var v=e.name?\" c(e)}var o+\": \"+s}functi d(t,e,r){v n=0;return \")+\" \"+t.join(\" \")+\" \"+t.join(\" \")+\" p(t){retur t}function v(t){retur g(t){retur t}function t}function t}function _(t){retur void 0===t}func w(t){retur M(t)&&\"[ob k(t){retur M(t)&&\"[ob A(t){retur instanceof t}function S(t){retur t||void 0===t}func E(t){retur L(t){retur t=a)return new Error(\"unk command if(7!==r)t new Error(\"unk command i(t){for(v e}var new Error(\"fea index out of new new String too long (sorry, this will get fixed later)\");v l(t){for(v e(t){var e=n(t);ret e?u in r(t,e){var o(t){var i?u in i&&delete t){var r?r[0]:\"\"} n?!r&&en)t al-ahad\",\" {0} not {0} {0} {0} mix {0} and {1} a(t,e){ret ;var format a date from another number at position name at position literal at position text found at dd M MM d, d M d M d M d M yyyy\",RSS: d M a=this;ret var _inline_1_ = - var _inline_1_ = - >= 0) !== (_inline_1 >= 0)) {\\n + 0.5 + 0.5 * (_inline_1 + _inline_1_ / (_inline_1 - }\\n n(t,e){var r=[];retur strict\";va u(r,i){ret i(t,e){var void E.remove() void null;var strict\";va void c();var t}function i(t){var e=x[t];ret a(t){retur the calendar system to use with `\"+t+\"` date data.\"}var i={};retur t}var i?\"rgba(\"+ n=i(t);ret t){var A(e,r){var T(){var void strict\";va strict\";va strict\";va strict\";va strict\";va strict\";va n(){var e(e){retur r;try{r=ne strict\";va 
i(t,e,r,n) a(t){var void n.remove() void \")}).split \")}).split scale(\"+e+ n,i,a;retu strict\";va 0 1,1 0 0,1 \"+a+\",\"+a+ 0 0 1 \"+a+\",\"+a+ 0 0 1 \"+r+\",\"+r+ 0 0 1 \"+r+\",\"+r+ 0 0 1 0 1,1 0 0,1 0 1,1 0 0,1 n(t,e,r,n) t.id});var strict\";va strict\";va i(t,e,r){v r(t){var void r.remove() r(e,r,o){v if(i[r]){v o;if(void strict\";va n(t){var n(r){retur strict\";va n(t){for(v \");var i(t,e){var click on legend to isolate individual l(t){var u(t){var strict\";va r[1]}retur i}function i(t){retur t[0]}var h(t){var f(t){var d(t){var n(t,e){var i(t){for(v n(t){for(v 0}}var o(t,e){var 0 1,1 0 0,1 extra params in segment t(e).repla strict\";va strict\";va u(r,i){ret r(t,e){ret l(t,e,r){v u(t,e,r){v c(t,e){var n(){return p(t,e){var g(t,e){ret y(t,e){ret b(t,e,r){v x(t,e){var _(t){for(v r(t,e){ret strict\";va strict\";va strict\";va t){var void t)return void void n}function l(t){retur u(t){retur c(t){retur d\")}functi h(t){retur d, yyyy\")}var t.getTime} r={};retur n=new a(t){retur o(t){for(v r={};retur n(){return strict\";va for(var c(t){retur void property r(t,e){var instanceof RegExp){va void o(t,e){ret t>=e}var binary r=e%1;retu n(t){var e=i(t);ret n(t,e){ret i(t){retur \")}functio a(t,e,r){v was an error in the tex null;var r=0;r1)for i=1;i doesnt match end tag . Pretending it did s}function c(t,e,r){v o(),void e();var 0,\":\"],k=n t(t,e){ret void n(t){var i(){var 1px new strict\";va strict\";va n(t,e){for r=new new Error(\"No DOM element with id '\"+t+\"' exists on the page.\");re 0===t)thro new Error(\"DOM element provided is null or previous rejected promises from t.yaxis1); array edits are incompatib with other edits\",h); full array edit out of if(void & removal are incompatib with edits to the same full object edit new Error(\"eac index in \"+r+\" must be new Error(\"gd. must be an 0===e)thro new is a required new Error(\"cur and new indices must be of equal u(t,e,r){v new Error(\"gd. 
must be an 0===e)thro new Error(\"tra must be in in i(t){retur a(t,e){var r=0;return new Error(\"Thi element is not a Plotly plot: \"+t+\". It's likely that you've failed to create a plot before animating it. For more details, see void c()}functi d(t){retur overwritin frame with a frame whose name of type \"number\" also equates to \"'+f+'\". This is valid but may potentiall lead to unexpected behavior since all plotly.js frame names are stored internally as This API call has yielded too many warnings. For the rest of this call, further warnings about numeric frame names will be addFrames accepts frames with numeric names, but the numbers areimplici cast to n(t){var i}function i(){var t={};retur a(t){var o(){var s(t){retur l(t){funct u(t){funct c(t){retur h(t,e,r){v f(t,e,r){v e={};retur t&&void n(t){retur Error(\"Hei and width should be pixel values.\")) l(t,e,r){v u(t,e,r,n) \"+o:s=o+(s dtick p(t,e){var c=new t.dtick){v error: t+i*e;var dtick a(t){for(v strict\";va v(r,n){ret to enter axis\")+\" e;var n(t,r){for n(t,e,r,n) u(t,e){ret y(t){var b(t,e,r){v back X(e,r){var K()}functi W(e){funct n(e){retur void k.log(\"Did not find wheel motion attributes \",e);var strict\";va n(t){retur t._id}func went wrong with axis Error(\"axi in in in o){var t(t){var e(t){retur strict\";va r(r,n){var e/2}}funct v(t,e){var g(t,e){var b(t,e){var x(t,e){var new Error(\"not yet r(t,r){for i(){for(va a(t,e){for n(t,e){var n(t){retur i(t,e,r,n) a(t,e){ret i(t,e){var r(t){retur n(t){var l(t,e){var u(t){var c(t,e){var f(t,e,r){v strict\";va i(t){var e=new n;return a(t){var o(t){var i=new n(t,e);ret strict\";va Sans Regular, Arial Unicode MS r(t,e){ret - delete t)return e,n,i={};f in i}return r=a(t);ret e&&delete P=(new + '' + '' + '' + '' + '' + '' + '' + '' + '' + '' + '' + '' + '' + '' + 0px\",\"1px -1px\",\"-1p 1px\",\"1px \"+t+\" 0 \"+n+\" \"+n+\" \"+n+\" void c=\"t: \"+u.t+\", r: l;var r in t)r in r in r=e||6;ret 0===t)retu null;var void t(){var t={};retur n.mode,del strict\";va 
e(e,i){ret s;return t(t,e){ret i=r[n];ret strict\";va t(t,e,r){v e(t,e){ret r(t,e){ret a(t,i){var tozoom back f(t,e){var i(t){retur y,b;return o,s;return ii))return e}return void h(t){retur f(t,e){ret void strict\";va strict\";va 0, 0, strict\";va s;return r in void strict\";va null;for(v strict\";va o(e){var s(e){var strict\";va n(t,e,r){v i(t,e,r){v strict\";va converged strict\";va strict\";va strict\";va strict\";va s(r,i){ret n(t,e,r,n) strict\";va n(t,e){for o(t){retur strict\";va void strict\";va strict\";va c(r,i){ret loop in contour?\") s(t,e,r){v 15===r?0:r many contours, clipping at i}function a(t,e,r){v o(t,e,r){v s(t,e,r,n) e=l(t,r) r(t){retur to newendpt is not vert. or perimeter scale is not scale is not void data invalid for the specified inequality many contours, clipping at strict\";va strict\";va h(t){retur to newendpt is not vert. or perimeter o(t,e,r){v s(t,e,r){v scale is not scale is not strict\";va iterated with no new in strict\";va g}var didn't converge strict\";va s=0;sa){va in strict\";va l(r,n){ret u(t){var e=l(t);ret strict\";va strict\";va e(e){var strict\";va strict\";va void r(t,e){ret traces support up to \"+u+\" dimensions at the c}var l(r,n){ret strict\";va l(n){var i}function c(t,e,r){v l(t,e,r){v n=o(r);ret u(t,e){ret c(t){retur h(t){var e=o(t);ret f(t){var d(t){retur t[0]}funct p(t,e,r){v m(t){var v(t){retur l(t){var u(t){retur c(t,e){for e.t+\"px \"+e.r+\"px \"+e.b+\"px 255, 255, 0)\");var 1px 1px #fff, -1px -1px 1px #fff, 1px -1px 1px #fff, -1px 1px 1px strict\";va i(t,e,r){v strict\";va n(t,e){for m};var strict\";va o(r,a){ret strict\";va strict\";va strict\";va n(t,e,r){v u;var 1;var a(t,e){var r(t,e){ret n(t,e){ret s(t,e){var 1;var t+\" void strict\";va strict\";va strict\";va 0, i(t,e){var r=new for(r=new is present in the Sankey data. 
In [4]:

# Helper function for pulling data from Quandl
def get_data(quandl_code):
    '''Download and cache Quandl data series'''
    cache_path = '{}.pkl'.format(quandl_code).replace('/', '-')
    try:
        f = open(cache_path, 'rb')
        df = pickle.load(f)
        # print('Loaded {} from cache'.format(quandl_code))
    except (OSError, IOError) as e:
        # print('Downloading {} from Quandl'.format(quandl_code))
        df = quandl.get(quandl_code)
        df.to_pickle(cache_path)
        # print('Cached {} at {}'.format(quandl_code, cache_path))
    return df

In [5]:

# Quandl codes for each dataset.
data_sets = ...

# Cool loop to define variables.
dict_of_df = {}
for item in data_sets:
    data_code = ...
    if item == \"EURGBP\":
        dataset = get_data(...)
    else:
        dataset = get_data(...)
    dict_of_df[item] = dataset

Format data

After importing the data it needs to be changed into a convenient form. In this notebook I download the price of Bitcoin in GBP, and the prices of Ethereum and Litecoin in Bitcoin. Each of these is a different dataset on Quandl, so I copied the relevant data from each dataset into one new dataframe that contained everything I was interested in. The price of Ethereum or Litecoin in GBP is the product of their respective prices in Bitcoin and Bitcoin's price in GBP.
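The download-and-cache pattern in the helper above can be sketched generically. This is a minimal sketch, not the notebook's exact code: `fetch` stands in for the Quandl download call, and the cache filename scheme is illustrative.

```python
import os
import pickle

def get_cached(key, fetch, cache_dir='.'):
    '''Return cached data for `key` if a pickle exists; otherwise call
    `fetch()` (e.g. a Quandl download), cache the result, and return it.'''
    # Derive a filesystem-safe cache path from the dataset key.
    cache_path = os.path.join(cache_dir, '{}.pkl'.format(key.replace('/', '-')))
    try:
        with open(cache_path, 'rb') as f:
            return pickle.load(f)   # cache hit: no network call needed
    except (OSError, IOError):
        data = fetch()              # cache miss: download the series
        with open(cache_path, 'wb') as f:
            pickle.dump(data, f)
        return data
```

The first call downloads and writes the pickle; later calls with the same key load from disk, so re-running the notebook doesn't repeat the download.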
I calculated this and created new columns to store the results.

In [6]:

# Helper function to take one column from many dfs and merge into a single new df
def merge_dfs_on_column(dict_of_df, col):
    '''Merge a single column of each dataframe into a new combined dataframe'''
    series_dict = {}
    for key in dict_of_df:
        series_dict[key] = dict_of_df[key][col]
    return pd.DataFrame(series_dict)

In [7]:

# Merge opening price for each currency into a single dataframe
df = merge_dfs_on_column(dict_of_df, 'Open')
# df.tail()
... = np.NaN

# convert to GBP and rename columns
df['btc'] = df['gbp']
df['eth'] = df['gbp'] * df['eth_btc']
df['ltc'] = df['gbp'] * df['ltc_btc']
df['eur'] = df['EURGBP']
# df = ... axis=1)

In [8]:

# put price data in its own df (\"prices\") to do growth analysis later
# keep prices in GBP because GBP varies less than any other common measure.
prices = df[['btc', 'eth', 'ltc', 'eth_btc', 'ltc_btc']]

In [9]:

prices = ...

Asset charts

Bitcoin - £

In [10]:

# Bitcoin price
series1 = go.Scatter(..., name='Price', line=dict(color='green', width=2))
series2 = go.Scatter(..., name='7 day SMA', line=dict(color='blue', width=1))
series3 = go.Scatter(..., name='30 day SMA', line=dict(color='red', width=1))
data = [series1, series2, series3]
layout = go.Layout(title='Bitcoin price', legend=dict(yanchor='top', y=1.1, x=0.5))
fig = go.Figure(data=data, layout=layout)
py.iplot(fig)

Out[10]: [chart]

Ethereum - £

In [11]:

# Ethereum price
series1 = go.Scatter(..., name='ETH', line=dict(color='green', width=2))
series2 = go.Scatter(..., name='7 day SMA', line=dict(color='blue', width=1))
series3 = go.Scatter(..., name='30 day SMA', line=dict(color='red', width=1))
data = [series1, series2, series3]
layout = go.Layout(title='Ethereum price', legend=dict(yanchor='top', y=1.1, x=0.5))
fig = go.Figure(data=data, layout=layout)
py.iplot(fig)

Out[11]: [chart]

Ethereum - BTC

In [12]:

# Ethereum price
series1 = go.Scatter(..., line=dict(color='green', width=2))
series2 = go.Scatter(..., name='7 day SMA', line=dict(color='blue', width=1))
series3 = go.Scatter(..., name='30 day SMA', line=dict(
    color='red', width=1))
data = [series1, series2, series3]
layout = go.Layout(title='Ethereum / Bitcoin', legend=dict(yanchor='top', y=1.1, x=0.5))
fig = go.Figure(data=data, layout=layout)
py.iplot(fig)

Out[12]: [chart]

Litecoin - £

In [13]:

# Litecoin price
series1 = go.Scatter(..., name='LTC', line=dict(color='green', width=2))
series2 = go.Scatter(..., name='7 day SMA', line=dict(color='blue', width=1))
series3 = go.Scatter(..., name='30 day SMA', line=dict(color='red', width=2))
data = [series1, series2, series3]
layout = go.Layout(title='Litecoin price', legend=dict(yanchor='top', y=1.1, x=0.5))
fig = go.Figure(data=data, layout=layout)
py.iplot(fig)

Out[13]: [chart]

Litecoin - BTC

In [14]:

win1 = 7
win2 = 30

# Litecoin price
series1 = go.Scatter(..., line=dict(color='green', width=1))
series2 = go.Scatter(..., name='{} day SMA'.format(win1), line=dict(color='blue', width=2))
series3 = go.Scatter(..., name='{} day SMA'.format(win2), line=dict(color='red', width=2))
data = [series1, series2, series3]
layout = go.Layout(title='Litecoin / Bitcoin', legend=dict(yanchor='top', y=1.1, x=0.5))
fig = go.Figure(data=data, layout=layout)
py.iplot(fig)

Out[14]: [chart]

SMA Analysis

The price data for the digital assets above shows a high degree of variance, so when I first visualized the price history I wanted to smooth it somehow. I plotted a simple moving average and became curious what different types of average would look like. I noticed that the longer and shorter SMAs (Simple Moving Averages) cross each other occasionally, and I wondered if this would make a useful trading signal. The code below (if it's hidden, click the 'show code' link at the beginning of the post) shows how to calculate SMAs, identify when two time series cross each other, and calculate the returns from buying or selling depending on when the long SMA moves above the short SMA, or when the short moves above the long. I wanted to know which combinations of SMA periods would yield the best results, and the heat maps show this.
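The crossover detection and trade labelling described above can be sketched with pandas. This is a minimal sketch, not the notebook's exact code: the column and window names are illustrative, and it follows one convention (sell when the short SMA crosses above the long, buy on the cross below).

```python
import numpy as np
import pandas as pd

def sma_crossovers(price, win_short, win_long):
    '''Label bars where the short SMA crosses the long SMA and compute the
    net profit of trading those signals (sum of sells minus sum of buys).'''
    df = pd.DataFrame({'price': price})
    df['sma_s'] = df['price'].rolling(win_short).mean()
    df['sma_l'] = df['price'].rolling(win_long).mean()
    df['diff'] = df['sma_s'] - df['sma_l']
    df = df.dropna()

    # A crossover happens wherever the sign of the SMA difference changes.
    sign = np.sign(df['diff'])
    cross = sign != sign.shift()
    cross.iloc[0] = False  # the first row has no previous sign to compare

    df['trade'] = ''
    df.loc[cross & (df['diff'] < 0), 'trade'] = 'buy'   # short SMA drops below long
    df.loc[cross & (df['diff'] > 0), 'trade'] = 'sell'  # short SMA rises above long

    trades = df[df['trade'] != '']
    # The strategy must start with a buy, so drop a leading sell.
    if len(trades) and trades['trade'].iloc[0] == 'sell':
        trades = trades.iloc[1:]
    profit = (trades.loc[trades['trade'] == 'sell', 'price'].sum()
              - trades.loc[trades['trade'] == 'buy', 'price'].sum())
    return trades, profit
```

Note that if the series ends between a buy and the next sell, the open position simply drops out of `profit`, which is also how a sum-of-sells-minus-sum-of-buys calculation behaves.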
Finally, for a given date range, asset pair and short and long SMA combination, I plotted the performance of the trading strategy with those parameters through time. This was to see if the algorithm's performance was consistent or if large losses could occur. It would also show how much upfront cost would have been needed to realise the returns.

Net gains with different SMA combinations

In [15]:

# identify where the two SMAs cross over each other
# return the dates where this occurs, and label them buy or sell
def crossOver(pair, sma1, sma2):
    if sma2 < sma1:
        sma1, sma2 = sma2, sma1
    df = pd.DataFrame()
    df['btcPrice'] = prices['btc']
    df['data'] = prices[pair]
    df['sma1'] = df['data'].rolling(window=sma1).mean()
    df['sma2'] = df['data'].rolling(window=sma2).mean()
    df['diff'] = df['sma1'] - df['sma2']
    df['inGBP'] = df['btcPrice'] * df['data']
    # sell ltc for btc when diff < 0 && cross == True
    # buy ltc with btc when diff > 0 && cross == True
    line = 0
    df = df.dropna()
    df['nextDiff'] = df['diff'].shift(-1)
    df['cross'] = (((df['diff'] >= line) & (df['nextDiff'] < line)) |
                   ((df['nextDiff'] > line) & (df['diff'] <= line)) |
                   (df['diff'] == line))
    rows = df[df['cross'] == True]
    rows['trade'] = '...'
    rows['trade'][(rows['cross'] == True) & (rows['diff'] > 0)] = 'sell'
    rows['trade'][(rows['cross'] == True) & (rows['diff'] < 0)] = 'buy'
    # df = all the data
    # rows = just the rows of df where SMA1 crosses SMA2
    out = {'data': df, 'xOver': rows}
    return out

In [16]:

crossOver(..., 3, ...)

In [17]:

# take the crossOver data and calculate how much you would gain or lose between
# the selected dates
def returns(pair, sma1, sma2, dt1, dt2=...):
    # make sure dt1, dt2 are correctly formatted!
    data = crossOver(pair, sma1, sma2)
    # crossOver returns a dictionary with 2 items:
    # just the rows where SMA1 crosses SMA2
    # just the rows between the dates we're interested in
    trades = ...
    # must start with a buy.
    # delete the first row if it's a sell
    if trades['trade'].iloc[0] != 'buy':
        trades = trades.iloc[1:]
    # calc the profit
    # make nice labels for the return dict
    buys = ...
    sells = ...
    buysGBP = ...
    sellsGBP = ...
    p = sells - buys
    pGBP = sellsGBP - buysGBP
    results = {'pGBP': pGBP, 'profit': p, 'trades': count, 'sum of buys': buys,
               'sum of sells': sells, 'data': trades, 'pair': pair}
    return results

In [18]:

# This function calls the other two functions (defined above)
# Input the asset pair, start and finish dates, and the range of SMAs to calc
# Returns an ok-ish heatmap
def ...(pair, maxDays, dt1, dt2=...):
    tbl = np.zeros((maxDays, maxDays))
    for i in range(maxDays):
        for j in range(maxDays):
            if j <= i:
                tbl[i, j] = np.NaN
            else:
                tbl[i, j] = returns(pair, i, j, dt1, dt2)['profit']
    # tbl
    trace = go.Heatmap(...)
    data = [trace]
    layout = go.Layout(title='{} ...'.format(pair))
    fig = go.Figure(data=data, layout=layout)
    out = py.iplot(fig)
    return out

In [19]:

...(..., maxDays=30, ...)
...(..., maxDays=30, ...)

Out[19]: [heatmap]
Out[19]: [heatmap]

Returns through time for one combination of sma1 and sma2

In [20]:

# This function creates a plot showing the profit through time for a given input
def ...(pair, sma1, sma2, dt1, dt2=...):
    out = returns(pair, sma1, sma2, dt1)
    ts = out['data']
    ts['data'] = np.where(ts['trade'] == 'buy', ts['data'] * -1, ts['data'])
    ts['dataGBP'] = np.where(ts['trade'] == 'buy', ts['inGBP'] * -1, ts['inGBP'])
    ts['returns'] = ...
    # ts
    series1 = go.Scatter(..., line=dict(color='blue', width=1))
    series2 = go.Scatter(..., name='av', line=dict(color='#137a28', width=2))
    data = [series1, series2]
    layout = go.Layout(title='{}: sma1 = {}d, sma2 = {}d'.format(pair, sma1, sma2),
                       legend=dict(yanchor='top', y=1.1, x=0.5))
    fig = go.Figure(data=data, layout=layout)
    plot = py.iplot(fig)
    results = {'data': ts, 'plot': plot}
    return results

In [21]:

...(..., sma1=8, sma2=5, pltly_name=...)

Out[21]: [chart]

In [22]:

...(..., sma1=10, sma2=6, ...)

Out[22]: [chart]

Next steps:

Create a bot to monitor real time price data, calculate the moving averages, and place trades. If you'd like to collaborate with me to do this, please contact me.
"},{"title":"I\u2019m a chartered\u00a0accountant","category":"Non-technical/Learning","url":"chartered.html","date":"19 October 2017","tags":"icaew, aca, accounting ","body":"Earlier this year I qualified as a chartered accountant. Qualification requires passing 15 exams and gaining 3,150 hours of work experience. To celebrate passing your exams and verify your success, the ICAEW print the names of everyone who passed in an advert in the Financial Times. This happened for me on the 26 January."},{"title":"Coworking in\u00a0Dublin","category":"Non-technical/Social","url":"coworking.html","date":"15 September 2017","tags":"freelance, coworking, dublin, element78, regus, officesuites, cocreate, studio9, glandore, tcube ","body":"Last week I arrived in Dublin and had two days to find a coworking space. I ran around Dublin visiting as many as possible. I'm looking to rent a hotdesk, which means I don't have any storage space at the office and I don't have a fixed desk. Below are my impressions of eight businesses offering either hot desks or dedicated desks, ranked in order of preference. It's subjective. The going rate in Dublin seems to be €200 - €300 per month for a hot desk. Hours range from 9am - 7pm Monday - Friday, to 24/7. First place: Dogpatch Labs. Dogpatch offer a coworking space and access to a community of tech startups.
They run regular networking and mentoring events and occupy 3 floors of a warehouse next to an old dock. The offices have a range of interior styles. The middle level is fairly standard but pleasant open plan offices; the top floor is a flexible working space with event areas. The lower level is a series of vaults that each contain meeting rooms of various sizes. There's a lot going on and there's a good buzz in the air. The range of interior spaces is a big plus for me and I'm looking forward to being able to switch up my working environment through the day. There are a couple of kitchen areas, and a table football and table tennis table on the top level. Access is 6am - midnight, and a hot desk is €200 per month. A dedicated desk is €400. Dogpatch occupy less than a quarter of the warehouse building; the remainder of the space is used as a shopping mall and a museum. In my opinion Dogpatch is by far the best value, and if you're working in tech it'll probably be your best option. Second place: Studio9. These guys are great. The space is a lot smaller than Dogpatch, comprising only ~12 desks. It's a basement, but don't let that put you off: it's open plan and has large windows at either end. The space is well designed and uses a lot of light colours and textures, including wood on the walls and floors. There are also plenty of pot plants. These combine to give a bright and airy atmosphere that feels very natural. There are alcoves down one side of the main room which serve as meeting rooms, and a garden and basic kitchen at the back. The owners want the space to foster a community, and I expect that this is achieved. There are no dedicated offices and no teams using multiple desks; everyone is an independent worker. The desks are large, and each has its own little wall to provide some privacy. €200 per month ongoing, or €250 for just one month. They offer a trial day for free.
Third place: OfficeSuites. The building looks well run and appears clean and fresh with adequate resources. The building is an old Georgian tenement with high ceilings, cornicing and lots of natural light. The furnishings and decor lean towards corporate rather than something more interesting, but this works well to add a professional feel to the homely architecture. Access is 24/7. There are only two rooms for hotdesking and most of the building is given to private offices. There's a garden with bike storage, meeting rooms and a quiet room for calls. Lockers are extra. They also offer a free day as a trial. €249 p/m for desk access 9am - 7pm Monday - Friday. Fourth place: Element78. Friendly and well resourced, but very corporate. The architecture is corporate glass and steel. They're situated on the ground floor in a posh business district with large financial firms for neighbours. It's too corporate for me, but if you wanted an impressive place to meet clients and a nice address, this could be it. There were about 15 hot-desks for rent, plus plenty of dedicated desks. Clients seemed to include mostly young tech companies. I'm hoping to find somewhere with more of a community, more inspiring architecture and more character. I was offered a free day as a trial. €200 for the first month, then €350 p/m. Fifth place: Glandore. Glandore offer a relatively luxurious package with a corporate feel. They have a few buildings in Dublin and only the flagship has hotdesking space. There's a super looking restaurant and a large club room to relax in, but these are features that I don't need, and the rates aren't competitive if all you need is a desk and somewhere to take calls.
It's set up for teams and for impressing clients, and independent tech workers are probably better off elsewhere. €295 p/m. Sixth place: Regus. Regus offer a polished onboarding experience and the friendly staff were quick to respond and generally helpful, but the office space was bland, grey and generic. The rooms I visited didn't have external windows, and it reminded me of the rooms banks often put auditors in. If grey walls and tube lights are your thing then this is for you. 24/7 access, and meeting rooms equipped with A/V. €299.70 p/m. Seventh place: CoCreate. I visited the southern branch and thought the building was a bit shabby and needed a fresh coat of paint. I recognised the desks as some of the cheapest available from IKEA. The rooms were small and needed cleaning, and there was weird art on the wall. The thought of paying €200 a month to sit at a small wobbly desk put me off. The place was also almost deserted. Maybe their other branch is better, but this is not for me. €220 p/m. Last place: tCube. Last and least, tCube seems to be putting in zero effort. When I visited I saw two rooms that needed painting, disorganised furniture and abandoned bits of computers lying around. The kitchen isn't high spec and the meeting room isn't big enough. I was also given a speech about how great the wifi was - a prerequisite that was taken for granted everywhere else. At €300 p/m for a dedicated desk it's easy to find better elsewhere."},{"title":"Bitnation","category":"Technical/Cryptocurrencies","url":"bitnation.html","date":"14 September 2017","tags":"bitnation, blockchain, disintermediation, identity, consulting, finance ","body":"I'm Consulting For Bitnation. Next week I begin working with Bitnation as their finance officer.
I'm excited about this because I get to work on a really ambitious project using new technology to do something innovative and valuable. Bitnation's purpose is to offer the same services as governments do, in a way that delivers more benefit to users. In the West this may not immediately sound like a big deal. Our governments are fairly organised and the services are usually \u201cgood enough\u201d. Most significantly, we are not used to thinking about ID services (passports, visas, drivers' licences) or registration services (land registry, marriage certificates) as services that we are customers of - like our internet service provider, or our bank. In many parts of the world dysfunctional or unjust governments represent a huge obstacle to improving everyday life; the progress and achievement that many people can hope to realise is limited because of this. If there was a viable alternative to a passport from a jurisdiction renowned for forgery, or a credit rating that acknowledged your land holdings despite your government's inability to maintain a credible database, then you could begin to travel, trade and enjoy the benefits that citizens of many western states take for granted. I'm excited that I get to use my skills in an innovative tech company that is aiming to do something valuable. Services include secure ID systems, asset registry and dispute resolution. Identification (in particular) is an area full of problems, and blockchain tech could offer some really significant improvements. Bitnation wants to create a platform where voluntary nations can be created and administered, and where people can choose what jurisdictional system they are part of. If this is widely implemented, jurisdictions would offer their own services according to their own principles, and because they are easy to create and membership is voluntary, jurisdictions would compete to attract citizens.
This should lead to improvements for the users of each service, and is intended to provide an alternative to the slow, expensive and opaque processing methods commonly associated with services from geographic states."},{"title":"Create a Multi-Signature Ethereum wallet using\u00a0Parity","category":"Technical/Cryptocurrencies","url":"ethereum-parity-multisig-wallet.html","date":"12 August 2017","tags":"ethereum, parity, blockchains, fintech, multi-sig, wallet ","body":"I recently set up a multi-sig Ethereum wallet and I couldn't find clear instructions. Here they are; I hope these instructions are useful for someone looking to get started. You'll need a way to interact with the Ethereum blockchain in order to deploy a wallet. There are several apps that you can use. I've used Parity because I found it simple and quick. Wallets are a type of contract, and there are two types of wallet: the Multi-Sig wallet and the Watch wallet. An Ethereum account is required to communicate with a contract, so if you want a multi-sig wallet with 3 signatories (for example) then you will need to have set up at least 1 of those 3 Ethereum accounts before creating the wallet. Parity. From their website: \u201cIntegrated directly into your Web browser, Parity is the fastest and most secure way of interacting with the Ethereum network.\u201d You can do a bunch of stuff with Parity, including mining Ether, managing accounts, interacting with different dapps, sending/receiving from different accounts, and setting up contracts. On the accounts tab, you can quickly set up wallets. If you use the Chrome plugin you will also get handy notifications when transactions are confirmed or require approval. Download and open Parity. For MacOS you can download and install Parity by visiting the Parity site and downloading the installer, or from the terminal using curl or Homebrew. Simple option: $ bash <(curl -kL) Homebrew: detailed instructions are here.
$ brew tap ...
$ brew install parity --stable

If you used the installer, then you open Parity by opening the app and then using the logo in the menubar. If you used Brew, then start Parity with the parity command and then go to the following address in your browser. You should now see something similar to this: [screenshot] Add or import accounts. Select the Accounts tab from the top of the page and then select \u201c+ Account\u201d. Either create new accounts or import them using your preferred method. You don't need to import all the accounts that will be part of the multi-sig wallet, but you will need to import or create the account that will own the wallet you are about to create. This account will need to have a large enough Ether balance to pay the transaction costs to deploy the multi-sig wallet onto the blockchain. The costs are tiny, but they are greater than zero. Create the wallet. Once you've either created or imported the account which will deploy the wallet, select \u201c+ Wallet\u201d from the accounts tab and choose \u201cMulti-Sig wallet\u201d. Click next. Enter a name for the wallet; if you want you can add a local description. The \u201cFrom account\u201d will be the contract owner, and this account will: be one of the wallet owners; need to have enough Ether to pay for the execution of the contract on chain. Click the \u201c+\u201d button under \u201cOther wallet owners\u201d to add the address of the other signatory accounts.
You'll need to add one line for each signatory, and these accounts will also own the wallet once it is deployed. In the \u201crequired owners\u201d section, specify how many accounts will need to approve a transaction that is above the daily limit. Use the \u201cwallet day limit\u201d to set how much Ether can be spent by each account per day without needing another account to approve the transaction. Set an amount of 0 if you want all transactions to require approval, or turn the option off using the slider to the right (which just specifies a huge number). Click \u201cnext\u201d and you'll be shown a pop-up window to approve the creation of the wallet. You will need to enter the password of the account which is creating the wallet, and once you click \u201cConfirm request\u201d the funds in the creator's account will be used to deploy the contract on chain and create the wallet. Adding an existing wallet. Once your wallet is created and deployed, you'll need to add it to other Parity clients so that the other signatories can make or confirm transactions and view the wallet's balance. This is done by adding a watch wallet. Process: Accounts tab > + Wallet > Watch wallet > enter the address of the multi-sig wallet. The other signatories will now be able to view the wallet's balance, get notifications about pending confirmations, and be able to make and approve transactions. Managing a wallet. Anyone can put funds into the wallet, just like a normal account. Just send Ether to the wallet's address. At the top of the page you click \u201cEdit\u201d to change the local name and description of the wallet. \u201cSettings\u201d allows you to add or remove owners (signatories) of the wallet and change the required number of approvals and the wallet day limit. If you change these settings then the changes will need to be executed on the blockchain, and the account requesting the change will therefore need to pay the required funds.
Depending on the settings being changed, other accounts will need to approve the changes before they take effect. \u201cForget\u201d will remove the multi-sig wallet from your accounts tab. Moving funds out of a wallet. Click on \u201cTransfer\u201d in the wallet management window (pictured above) to begin withdrawing funds from the wallet. Select the token you want to transfer - Ethereum is the only option. \u201cSender address\u201d - specify which account wants to withdraw the funds from the wallet. \u201cRecipient address\u201d - specify which account will receive the funds. \u201cAmount to transfer\u201d - specify how much you want to transfer. If the amount is greater than the remaining daily limit you will get a warning bar telling you the transaction will require confirmation from other wallet owners. If you want to specify the maximum transaction fee (a payment with a lower fee will be confirmed more slowly than usual), tick the \u201cadvanced sending options\u201d box. Clicking \u201csend\u201d will bring you to the confirmation stage where you can enter the password for the account which is requesting the transfer. If approval from other wallet owners is required and they are also using Parity, then they can see that their approval is required in two ways: The signer tab will show there is a pending request.
The wallet management window (accessed from the accounts tab) has a \u201cpending transactions\u201d section where any confirmation requests will be shown."},{"title":"Macro analysis of the Bitcoin\u00a0blockchain","category":"Technical/Data","url":"macro-btc.html","date":"3 August 2017","tags":"bitcoin, blockchain, distributed applications, finance ","body":"Table of contents: average transaction confirmation time; average block size (daily, MB); average number of transactions per (1MB) block; fees earned by miners each day; ratio of transaction fees to transaction volume; number of transactions per day; Bitcoin price; ratio of unique addresses to transactions; correlations between each time series.

In [1]:

from IPython.display import HTML
HTML('''<script>
function code_toggle() {
    if (code_show) { $('div.input').hide(); } else { $('div.input').show(); }
    code_show = !code_show;
}
$(document).ready(code_toggle);
</script>''')

This analysis was made using Python. If you'd like to see the code used, click the \u201cshow code\u201d link.
model, view, vec3 vec3 vec4 f_id;\\n\\nv main() {\\n || \\n {\\n gl_Positio = } else {\\n gl_Positio = projection * view * model * vec4(posit 1.0);\\n gl_PointSi = pointSize; }\\n f_id = id;\\n f_position = mediump GLSLIFY 1\\n\\nattri vec3 mat4 model, view, main() {\\n gl_Positio = projection * view * model * vec4(posit mediump GLSLIFY 1\\n\\nunifo vec3 main() {\\n gl_FragCol = i(t){for(v null;for(v function() E=new new s(\"\",\"Inva data type for attribute \"+h+\": new s(\"\",\"Unkn data type for attribute \"+h+\": \"+f);var new s(\"\",\"Inva data type for attribute \"+h+\": n(t){retur new i(t,e){for r=new new s(\"\",\"Inva uniform dimension type for matrix \"+name+\": new s(\"\",\"Unkn uniform data type for \"+name+\": \"+r)}var new s(\"\",\"Inva data new data type for vector \"+name+\": r=[];for(v n in e){var r}function h(e){for(v n=[\"return function new s(\"\",\"Inva data new s(\"\",\"Inva uniform dimension type for matrix \"+name+\": \"+t);retur i(r*r,0)}t new s(\"\",\"Unkn uniform data type for \"+name+\": \"+t)}}func i){var p(t){var r=0;r1){l[ u=1;u1)for l=0;l=0){v t||t}funct s(t){funct r(){for(va u=0;u 1.0) {\\n discard;\\n }\\n baseColor = color, step(radiu gl_FragCol = * baseColor. 
mediump GLSLIFY 1\\n\\nattri vec2 vec4 mat3 float vec4 vec4 main() {\\n vec3 hgPosition = matrix * vec3(posit 1);\\n gl_Positio = 0, gl_PointSi = pointSize; vec4 id = pickId + pickOffset id.y += floor(id.x / 256.0);\\n id.x -= floor(id.x / 256.0) * 256.0;\\n\\n id.z += floor(id.y / 256.0);\\n id.y -= floor(id.y / 256.0) * 256.0;\\n\\n id.w += floor(id.z / 256.0);\\n id.z -= floor(id.z / 256.0) * 256.0;\\n\\n fragId = mediump GLSLIFY 1\\n\\nvaryi vec4 main() {\\n float radius = length(2.0 * - 1.0);\\n if(radius > 1.0) {\\n discard;\\n }\\n gl_FragCol = fragId / i(t,e){var instanceof instanceof null;var n(t,e,r,n) highp GLSLIFY 1\\n\\n\\nvec posHi, vec2 posLo, vec2 scHi, vec2 scLo, vec2 trHi, vec2 trLo) {\\n return vec4((posH + trHi) * scHi\\n \\t\\t\\t//FI this thingy does not give noticeable precision gain, need test\\n + (posLo + trLo) * scHi\\n + (posHi + trHi) * scLo\\n + (posLo + trLo) * scLo\\n , 0, vec2 positionHi float size, vec2 char, is 64-bit form of scale and vec2 scaleHi, scaleLo, translateH float vec4 sampler2D vec4 charColor, vec2 vec2 float float main() {\\n charColor = vec2(color / 255., 0));\\n borderColo = vec2(color / 255., 0));\\n\\n gl_PointSi = size * pixelRatio pointSize = size * charId = char;\\n borderWidt = border;\\n\\ gl_Positio = positionHi positionLo scaleHi, scaleLo,\\n translateH pointCoord = viewBox.xy + (viewBox.z - viewBox.xy * * .5 + highp GLSLIFY 1\\n\\nunifo sampler2D vec2 float charsStep, pixelRatio vec4 vec4 vec2 vec2 float float main() {\\n\\tvec2 pointUV = (pointCoor - + pointSize * .5) / = 1. - texCoord = ((charId + pointUV) * charsStep) / dist = alpha\\n\\ti (dist t;){var w.push(new i(){var a(t,e){var e=void null;var number of characters is more than maximum texture size. Try reducing x=0;x 1.0) {\\n discard;\\n }\\n vec4 baseColor = color, float alpha = 1.0 - pow(1.0 - baseColor. 
fragWeight gl_FragCol = * alpha, highp GLSLIFY 1\\n\\nvec4 pfx_1_0(ve scaleHi, vec2 scaleLo, vec2 translateH vec2 translateL vec2 positionHi vec2 positionLo {\\n return + translateH * scaleHi\\n + (positionL + translateL * scaleHi\\n + (positionH + translateH * scaleLo\\n + (positionL + translateL * scaleLo, 0.0, vec2 positionHi vec4 vec2 scaleHi, scaleLo, translateH float vec4 vec4 main() {\\n\\n vec4 id = pickId + pickOffset id.y += floor(id.x / 256.0);\\n id.x -= floor(id.x / 256.0) * 256.0;\\n\\n id.z += floor(id.y / 256.0);\\n id.y -= floor(id.y / 256.0) * 256.0;\\n\\n id.w += floor(id.z / 256.0);\\n id.z -= floor(id.z / 256.0) * 256.0;\\n\\n gl_Positio = scaleLo, translateH translateL positionHi positionLo gl_PointSi = pointSize; fragId = mediump GLSLIFY 1\\n\\nvaryi vec4 main() {\\n float radius = length(2.0 * - 1.0);\\n if(radius > 1.0) {\\n discard;\\n }\\n gl_FragCol = fragId / i(t,e){var e(e,r){ret e in n(t,e){var in r)return r[t];for(v o=r.gl d(t){var null;var a(t,e){ret new E=new i(t,e){var r=new n(t);retur 0.0 ||\\n || {\\n discard;\\n }\\n\\n vec3 N = vec3 V = vec3 L = {\\n N = -N;\\n }\\n\\n float specular = V, N, roughness) float diffuse = min(kambie + kdiffuse * max(dot(N, L), 0.0), 1.0);\\n\\n //decide how to interpolat color \\u2014 in vertex or in fragment\\n vec4 surfaceCol = .5) * vec2(value value)) + step(.5, vertexColo * vColor;\\n\\ vec4 litColor = surfaceCol * vec4(diffu * + kspecular * vec3(1,1,1 * specular, 1.0);\\n\\n gl_FragCol = mix(litCol contourCol contourTin * mediump GLSLIFY 1\\n\\nattri vec4 uv;\\nattri float f;\\n\\nunif mat3 mat4 model, view, float height, sampler2D float value, kill;\\nvar vec3 vec2 vec3 eyeDirecti vec4 main() {\\n vec3 dataCoordi = permutatio * vec3(uv.xy height);\\n vec4 worldPosit = model * 1.0);\\n\\n vec4 clipPositi = projection * view * clipPositi = clipPositi + zOffset;\\n gl_Positio = value = f;\\n kill = -1.0;\\n = = uv.zw;\\n\\n vColor = vec2(value value));\\n //Don't do lighting for contours\\n 
surfaceNor = vec3(1,0,0 eyeDirecti = vec3(0,1,0 lightDirec = mediump GLSLIFY 1\\n\\nunifo vec2 vec3 float float value, kill;\\nvar vec3 vec2 vec3 v) {\\n float vh = 255.0 * v;\\n float upper = floor(vh); float lower = fract(vh); return vec2(upper / 255.0, floor(lowe * 16.0) / main() {\\n if(kill > 0.0 ||\\n || {\\n discard;\\n }\\n vec2 ux = / shape.x);\\ vec2 uy = / shape.y);\\ gl_FragCol = vec4(pickI ux.x, uy.x, ux.y + i(t){var o(t,e){var new invalid coordinate for new Invalid texture size\");ret s(t,e){ret new Invalid ndarray, must be 2d or 3d\");var new Invalid shape for new Invalid shape for pixel new Incompatib texture format for new Invalid texture new Floating point textures not supported on this platform\") s=u(t);ret s=u(t);ret f(t,e){var new Invalid texture size\");var new Invalid shape for new Invalid shape for pixel b=u(t);ret new Error(\"gl- Too many vertex n(t,e,r){v i=new n(t){for(v n(t,e){var n(t,e,r){v instanceof a=new a(t,e){ret o(t){for(v e=[\"functi orient(){v orient\");v n=new a(t,e){var o(t,e){var s(t,e){var i}}functio c(t,e){for s(this,t); s(this,t); b}for(var r}return n}return l}function i(t,e,r,n) n(t,e){var r;if(h(t)) new Error('Unk function type -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 
2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n\\n // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def float vec2 vec2 vec2 vec2 vec2 vec2 float float sampler2D vec2 vec2 float float main() {\\n // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line or when fading out\\n // float blur = u_blur * float alpha = clamp(min( - (v_linewid - blur), v_linewidt - dist) / blur, 0.0, 1.0);\\n\\n float x_a = / 1.0);\\n float x_b = / 1.0);\\n float y_a = 0.5 + (v_normal. * v_linewidt / float y_b = 0.5 + (v_normal. * v_linewidt / vec2 pos_a = vec2(x_a, y_a));\\n vec2 pos_b = vec2(x_b, y_b));\\n\\n vec4 color = pos_a), pos_b), u_fade);\\n alpha *= u_opacity; gl_FragCol = color * gl_FragCol = highp lowp\\n#def floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. 
the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. Use this value to unscale the vec2 vec4 mat4 mediump float mediump float mediump float mediump float mediump float mat2 mediump float vec2 vec2 float float main() {\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * // We store the texture normals in the most insignific bit\\n // transform y so that 0 => -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n v_linesofa = // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def lowp vec4 lowp float float sampler2D float float vec2 vec2 vec2 vec2 float main() {\\n // Calculate the distance of the pixel from the line in pixels.\\n float dist = * // Calculate the antialiasi fade factor. This is either when fading in\\n // the line in case of an offset line or when fading out\\n // float blur = u_blur * float alpha = clamp(min( - (v_linewid - blur), v_linewidt - dist) / blur, 0.0, 1.0);\\n\\n float sdfdist_a = v_tex_a).a float sdfdist_b = v_tex_b).a float sdfdist = mix(sdfdis sdfdist_b, u_mix);\\n alpha *= smoothstep - u_sdfgamma 0.5 + u_sdfgamma sdfdist);\\ gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def floor(127 / 2) == 63.0\\n// the maximum allowed miter limit is 2.0 at the moment. the extrude normal is\\n// stored in a byte (-128..127 we scale regular normals up to length 63, but\\n// there are also \\\"special\\ normals that have a bigger length (of up to 126 in\\n// this case).\\n// #define scale 63.0\\n#def scale We scale the distance before adding it to the buffers so that we can store\\n// long distances for long segments. 
Use this value to unscale the vec2 vec4 mat4 mediump float mediump float mediump float mediump float vec2 float vec2 float float mat2 mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_extrude = a_data.xy - 128.0;\\n float a_directio = mod(a_data 4.0) - 1.0;\\n float a_linesofa = / 4.0) + a_data.w * 64.0) * // We store the texture normals in the most insignific bit\\n // transform y so that 0 => -1 and 1 => 1\\n // In the texture normal, x is 0 if the normal points straight up/down and 1 if it's a round cap\\n // y is 1 if the normal points up, and -1 if it points down\\n mediump vec2 normal = mod(a_pos, 2.0);\\n normal.y = sign(norma - 0.5);\\n v_normal = normal;\\n\\ float inset = u_gapwidth + (u_gapwidt > 0.0 ? u_antialia : 0.0);\\n float outset = u_gapwidth + u_linewidt * (u_gapwidt > 0.0 ? 2.0 : 1.0) + // Scale the extrusion vector down to a normal and then up by the line width\\n // of this vertex.\\n mediump vec2 dist = outset * a_extrude * scale;\\n\\n // Calculate the offset when drawing a line that is to the side of the actual line.\\n // We do this by creating a vector that points towards the extrude, but rotate\\n // it when we're drawing round end points (a_directi = -1 or 1) since their\\n // extrude vector points in another direction. 
mediump float u = 0.5 * a_directio mediump float t = 1.0 - abs(u);\\n mediump vec2 offset = u_offset * a_extrude * scale * normal.y * mat2(t, -u, u, t);\\n\\n // Remove the texture normal bit of the position before scaling it with the\\n // model/view matrix.\\n gl_Positio = u_matrix * * 0.5) + (offset + dist) / u_ratio, 0.0, 1.0);\\n\\n v_tex_a = * normal.y * + u_tex_y_a) v_tex_b = * normal.y * + // position of y on the screen\\n float y = gl_Positio / // how much features are squished in the y direction by the tilt\\n float squish_sca = / * // how much features are squished in all directions by the float = 1.0 / (1.0 - min(y * u_extra, 0.9));\\n\\n v_linewidt = vec2(outse inset);\\n v_gamma_sc = * mediump lowp\\n#def mapbox: define lowp vec4 mapbox: define lowp float vec2 v_pos;\\n\\n main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ float dist = length(v_p - float alpha = 0.0, dist);\\n gl_FragCol = outline_co * (alpha * gl_FragCol = highp lowp\\n#def vec2 mat4 vec2 vec2 mapbox: define lowp vec4 mapbox: define lowp float main() {\\n #pragma mapbox: initialize lowp vec4 #pragma mapbox: initialize lowp float opacity\\n\\ gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 vec2 v_pos;\\n\\n main() {\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n // find distance to outline for alpha float dist = length(v_p - float alpha = 0.0, dist);\\n \\n\\n gl_FragCol = mix(color1 color2, u_mix) * alpha * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec2 float float float vec2 mat4 vec2 vec2 vec2 vec2 v_pos;\\n\\n main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many 
pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / v_pos = / gl_Positio + 1.0) / 2.0 * mediump lowp\\n#def float vec2 vec2 vec2 vec2 float sampler2D vec2 vec2 main() {\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos = imagecoord vec4 color1 = pos);\\n\\n vec2 imagecoord = mod(v_pos_ 1.0);\\n vec2 pos2 = vec4 color2 = pos2);\\n\\n gl_FragCol = mix(color1 color2, u_mix) * gl_FragCol = highp lowp\\n#def mat4 vec2 vec2 vec2 vec2 float float float vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n vec2 scaled_siz = u_scale_a * vec2 scaled_siz = u_scale_b * // the correct offset needs to be calculated //\\n // The offset depends on how many pixels are between the world origin and\\n // the edge of the tile:\\n // vec2 offset = size)\\n //\\n // At high zoom levels there are a ton of pixels between the world origin\\n // and the edge of the tile. The glsl spec only guarantees 16 bits of\\n // precision for highp floats. 
We need more than that.\\n //\\n // The pixel_coor is passed in as two 16 bit values:\\n // = / 2^16)\\n // = 2^16)\\n //\\n // The offset is calculated in a series of steps that should preserve this precision: vec2 offset_a = scaled_siz * 256.0, scaled_siz * 256.0 + vec2 offset_b = scaled_siz * 256.0, scaled_siz * 256.0 + v_pos_a = * a_pos + offset_a) / v_pos_b = * a_pos + offset_b) / mediump lowp\\n#def float float sampler2D sampler2D vec2 vec2 float float float float vec3 main() {\\n\\n // read and cross-fade colors from the main and parent tiles\\n vec4 color0 = v_pos0);\\n vec4 color1 = v_pos1);\\n vec4 color = color0 * u_opacity0 + color1 * u_opacity1 vec3 rgb = color.rgb; // spin\\n rgb = vec3(\\n dot(rgb, dot(rgb, dot(rgb, // saturation float average = (color.r + color.g + color.b) / 3.0;\\n rgb += (average - rgb) * // contrast\\n rgb = (rgb - 0.5) * + 0.5;\\n\\n // brightness vec3 u_high_vec = vec3 u_low_vec = gl_FragCol = u_low_vec, rgb), gl_FragCol = highp lowp\\n#def mat4 vec2 float float vec2 vec2 vec2 vec2 main() {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1);\\n v_pos0 = / 32767.0) - 0.5) / u_buffer_s ) + 0.5;\\n v_pos1 = (v_pos0 * + mediump lowp\\n#def sampler2D sampler2D lowp float vec2 vec2 main() {\\n lowp float alpha = v_fade_tex * u_opacity; gl_FragCol = v_tex) * gl_FragCol = highp lowp\\n#def vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool vec2 vec2 vec2 vec2 main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ vec2 extrude = * (a_offset / 64.0);\\n if {\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * } else {\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def 
sampler2D sampler2D lowp vec4 lowp float lowp float lowp float vec2 vec2 float main() {\\n lowp float dist = v_tex).a;\\ lowp float fade_alpha = lowp float gamma = u_gamma * lowp float alpha = - gamma, u_buffer + gamma, dist) * gl_FragCol = u_color * (alpha * gl_FragCol = highp lowp\\n#def float PI = vec2 vec2 vec2 vec4 matrix is for the vertex mat4 mediump float bool bool mediump float mediump float mediump float vec2 vec2 vec2 vec2 float main() {\\n vec2 a_tex = mediump float a_labelmin = a_data[0]; mediump vec2 a_zoom = a_data.pq; mediump float a_minzoom = a_zoom[0]; mediump float a_maxzoom = a_zoom[1]; // u_zoom is the current zoom level adjusted for the change in font size\\n mediump float z = 2.0 - u_zoom) - (1.0 - u_zoom));\\ // map\\n // map | viewport\\n if {\\n lowp float angle = ? (a_data[1] / 256.0 * 2.0 * PI) : u_bearing; lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, asin, -1.0 * asin, acos);\\n vec2 offset = RotationMa * a_offset;\\ vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos + extrude, 0, 1);\\n gl_Positio += z * // viewport\\n // map\\n } else if {\\n // foreshorte factor to apply on pitched maps\\n // as a label goes from horizontal vertical in angle\\n // it goes from 0% foreshorte to up to around 70% lowp float pitchfacto = 1.0 - cos(u_pitc * sin(u_pitc * 0.75));\\n\\ lowp float lineangle = a_data[1] / 256.0 * 2.0 * PI;\\n\\n // use the lineangle to position points a,b along the line\\n // project the points and calculate the label angle in projected space\\n // this calculatio allows labels to be rendered unskewed on pitched maps\\n vec4 a = u_matrix * vec4(a_pos 0, 1);\\n vec4 b = u_matrix * vec4(a_pos + 0, 1);\\n lowp float angle = - b[0]/b[3] - a[0]/a[3]) lowp float asin = sin(angle) lowp float acos = cos(angle) mat2 RotationMa = mat2(acos, -1.0 * asin, asin, acos);\\n\\n vec2 offset = RotationMa * 1.0) * a_offset); vec2 extrude = * (offset / 64.0);\\n gl_Positio = u_matrix 
* vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n gl_Positio += z * // viewport\\n // viewport\\n } else {\\n vec2 extrude = * (a_offset / 64.0);\\n gl_Positio = u_matrix * vec4(a_pos 0, 1) + vec4(extru 0, 0);\\n }\\n\\n v_gamma_sc = (gl_Positi - 0.5);\\n\\n v_tex = a_tex / u_texsize; v_fade_tex = / 255.0, mediump lowp\\n#def float float float float main() {\\n\\n float alpha = 0.5;\\n\\n gl_FragCol = vec4(0.0, 1.0, 0.0, 1.0) * alpha;\\n\\n if > u_zoom) {\\n gl_FragCol = vec4(1.0, 0.0, 0.0, 1.0) * alpha;\\n }\\n\\n if (u_zoom >= v_max_zoom {\\n gl_FragCol = vec4(0.0, 0.0, 0.0, 1.0) * alpha * 0.25;\\n }\\n\\n if >= u_maxzoom) {\\n gl_FragCol = vec4(0.0, 0.0, 1.0, 1.0) * alpha * 0.2;\\n highp lowp\\n#def vec2 vec2 vec2 mat4 float float float main() {\\n gl_Positio = u_matrix * vec4(a_pos + a_extrude / u_scale, 0.0, 1.0);\\n\\n v_max_zoom = a_data.x;\\ = vec4 values, const float t) {\\n if (t 7)return[n have been deprecated as of v8\")];if(! in \"%s\" not strict\";va a(l,e,\"arr expected, %s a(l,e,\"arr length %d expected, length %d r?[new have been deprecated as of v8\")]:[];v n(e,r,\"obj expected, %s found\",a)] o=[];for(v s in must start with \"@\"'));ret strict\";va one of [%s], %s strict\";va t(e){var n(l,s,\"arr expected, %s n(l,s,'\"$t cannot be use with operator n(l,s,'fil array for operator \"%s\" must have 3 expected, %s key cannot be a functions not functions not strict\";va url must include a \"{fontstac url must include a \"{range}\" strict\";va n(c,r,'eit \"type\" or \"ref\" is i(e,r,\"%s is greater than the maximum value strict\";va n(e,r,\"obj expected, %s f in r){var property in n(e,r,'mis required property strict\";va i(e,o,'unk property strict\";va n(r,e,'\"ty is e)for(var c in a(t){retur Sans Unicode MS new new M=new in n){for(var many symbols being rendered in a tile. See many glyphs being rendered in a tile. 
See exceeds allowed extent, reduce your vector tile buffer size\")}ret new new Error(\"Inv LngLat object: (\"+t+\", new new x(){return y(){return point(){re new new new new instanceof 0===s&&voi a(void new Error(\"fai to invert strict\";va n={\" strict\";va s(t){retur l(t,e,r,n) o=(new out of n(t,e){ret mapbox: ([\\w]+) ([\\w]+) ([\\w]+) a=new n?e(new Error(\"Inp data is not a valid GeoJSON t.data)ret e(new Error(\"Inp data is not a valid GeoJSON e(new Error(\"Inp data is not a valid GeoJSON e=0;ee)){v y;for(y in in p)c[y]=!0; t in new new i(t,e,i){v r(t,r){ret delete e(t);var n=new o(new new e=new in tile source layer \"'+M+'\" does not use vector tile spec v2 and therefore may have some rendering g(t,L);var F in B in n=new t.time>=(n void void t=new new i;var strict\";va new Error(\"Inv color o[e]}throw new Error(\"Inv color void n in r in Error('Sou layer does not exist on source \"'+e.id+'\" as specified by style layer t in t.id});for new Error(\"Sty is not done new Error(\"The is no source with this ID\");var delete instanceof this;var 0===e)thro new Error(\"The is no layer with this ID\");for(v r in this;var void 0===i||voi 0===a?void strict\";va i(t){retur t.value}va r,n;for(va i in t){var in for(n in in in 0===e)dele 0===e)dele o}var strict\";va new t){var this.grid= a}if(r){va _=u;for(va a}}}return r=new r(\"glyphs > 65535 not i=!t&&new l(new c(new g(e,r){var y(e,r){var i(0,0));re M in a)t[M]=new strict\";va t){var | n(){}var i(t){retur new 61:case 107:case 171:case 189:case 109:case t=0,e=0;re t=new null!==t&& new Error(\"max must be between the current minZoom and 20, t,e={};ret t instanceof e;if(t instanceof instanceof c?t:new i(this,e); void Error(\"Fai to initialize s in if(void if(void n(t){var r=new n(t){for(v e=0;e1)for delete error c(t,e,r){v f(t,e){for t in null;var delete new Error(\"An API access token is required to use Mapbox GL. See new Error(\"Use a public access token (pk.*) with Mapbox GL JS, not a secret access token (sk.*). 
Unconfirmed transactions

In [4]: (code cell: load the downloaded mempool CSV with pandas, split the datetime column into separate date and time columns, and reorder the columns.)

In [5]: (code cell: there are 3 values per day; compute the average mempool size for each day.)

The number of transactions waiting to be confirmed on the Bitcoin blockchain increased to an all-time maximum of 175,978 on May 18th. For comparison, the average value in 2016 was less than 10,000. Once the number of unconfirmed transactions had peaked, it fell about as quickly as it rose, and by mid-July it was generally below 10,000 again. The current state of the unconfirmed transaction pool, along with the fee rates currently offered, can be seen here.
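The truncated cells In [4] and In [5] load and aggregate the mempool data; a minimal runnable sketch with pandas, using made-up sample rows because the real filename is elided in the source. The `rolling(window=7)` step mirrors the "7 day average" series shown in the figures (`min_periods=1` is my addition so the tiny sample still yields values):

```python
import io
import pandas as pd

# Hypothetical sample standing in for the downloaded mempool CSV
# (the real filename is elided in the source)
csv = io.StringIO(
    "2017-05-17 00:00:00,160000\n"
    "2017-05-17 08:00:00,164000\n"
    "2017-05-17 16:00:00,168000\n"
    "2017-05-18 00:00:00,170000\n"
    "2017-05-18 08:00:00,175978\n"
    "2017-05-18 16:00:00,172000\n"
)
mempool = pd.read_csv(csv, header=None, names=["datetime", "size"])

# split the datetime to date and time
stamps = pd.to_datetime(mempool["datetime"])
mempool["date"] = stamps.dt.date
mempool["time"] = stamps.dt.time
del mempool["datetime"]

# there are 3 values per day: average mempool size for each day
daily = mempool.groupby("date", as_index=False)["size"].mean()

# 7 day moving average, as plotted alongside each daily series
# (min_periods=1 so the short sample still produces values)
daily["7d"] = daily["size"].rolling(window=7, min_periods=1).mean()
print(daily)
```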
In\u00a0[6]: series1 = go.Scatter name='Dail average', line = dict( color = (color2), width = 2,)) series2 = go.Scatter name='Week average', line = dict( color = (color1), width = 3,)) data = [series1, series2] layout = go.Layout( transactio of yanchor='t y=1, x=0.5) ) fig = layout=lay py.iplot(f Out[6]: Median transactio confirmati time (minutes)\u00b6 would expect that the average time taken to confirm a transactio will increase with the size of the unconfirme transactio pool. The figure below shows the median time in minutes for a new transactio to be\u00a0confirm In\u00a0[7]: # The Daily Median time taken for transactio to be accepted into a block, presumably in minutes ATRCT = ATRCT = In\u00a0[8]: series1 = go.Scatter name='Dail median', line = dict( color = (color2), width = 2)) series2 = go.Scatter name='7 day average', line = dict( color = (color1), width = 3)) data = [series1, series2] layout = go.Layout( title='Med time taken for transactio to be accepted into a block', (minutes)' yanchor='t y=1.1, x=0.5) ) fig = layout=lay py.iplot(f Out[8]: The median transactio confirmati time does not increase noticeably when the pool of unconfirme transactio increases, in fact the two features have only a weak Pearson correlatio of 0.37 (details below). This is surprising because I expected that the time taken to confirm a transactio would increases when the pool of transactio waiting to be Perhaps this is because only valid transactio can be confirmed and included in the median average calculatio but invalid transactio are included in the pool of transactio awaiting confirmati One way to test this would be to query the transactio awaiting confirmati and quantify if they are valid and what fee rate they are\u00a0offeri Average block size (daily, MB)\u00b6Each block in the Bitcoin network had a maximum size of 1MB before 1 August 2017. 
As the Bitcoin network has grown and transaction volume has increased, the blocksize limit began to limit throughput. Was the increase in unconfirmed transactions correlated to the blocks getting "filled up" to their maximum 1MB size?

In [9]: (code cell: load AVBLS, the average block size in MB.)

In [10]: (code cell: clean the average block size data.)

In [11]: (code cell: Plotly figure of the block size in MB, with daily and '7 day average' series.)

Out[11]:

From March through June the blocksizes seem to have frequently hit their maximum possible size, suggesting that the Bitcoin network was processing the maximum amount of data possible. The increase in unconfirmed transactions occurred from mid-April to the end of June. The average block size began a sharp decrease on July 2nd, and at the same time the median transaction confirmation time also began a quick reduction. By July 2nd the number of unconfirmed transactions had already fallen back. (Not all transactions are the same size: a transaction can have any number of outputs and inputs, and a transaction with many inputs and/or outputs would be a larger amount of data than a transaction with only 1 input and 1 or 2 outputs.) Let's confirm whether the number of transactions increased over the same period.

Average number of transactions per (1MB) block¶

In [12]: (code cell: load NTRBL, the average number of transactions per block each day.)
In [13]: (code cell: Plotly figure of the average number of transactions per block, with daily and '7 day average' series.)

Out[13]:

The average number of transactions per block hit a peak at the end of May 2017 and then saw two sharp declines. It fell quickly at the beginning of June and then again at the beginning of July. In June the blocksizes remained more or less as large as possible, which suggests the blocks were full of a few large transactions. At this time the size of the mempool was still large. At the beginning of July the number of transactions per block reduced and the average blocksize was also rapidly reducing. This suggests that the volume of smaller transactions had reduced. The difference in average blocksizes in early June and early July suggests that in early June the number of transactions reduced because the average size of transactions had increased, but in July the number of transactions per block reduced because fewer transactions were being created. Perhaps Bitcoin exchanges and other organisations with high transaction volumes had changed their behaviour and begun posting larger transactions with many inputs and/or outputs, rather than posting many smaller transactions with fewer inputs and outputs. Bitcoin is often held by speculators who expect the value of a Bitcoin to increase. Perhaps increases in transaction volume are correlated to increases in price.

Transaction fees earned by miners each day¶

Transaction fees are charged to users sending Bitcoin.
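Miners choose which transactions to include by fee rate (fee per byte) rather than absolute fee, since block space is limited; a minimal sketch of that selection rule, with invented txids, fees and sizes:

```python
# Each unconfirmed transaction: (txid, fee in satoshis, size in bytes)
pool = [
    ("a", 22600, 226),   # 100 sat/byte
    ("b", 4520, 226),    #  20 sat/byte
    ("c", 56500, 565),   # 100 sat/byte, but a physically larger transaction
    ("d", 1130, 226),    #   5 sat/byte
]

# Sort by fee *rate*: the revenue-maximising order for a size-limited block
by_rate = sorted(pool, key=lambda tx: tx[1] / tx[2], reverse=True)
print([txid for txid, _, _ in by_rate])
```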
Node operators (miners) collect unconfirmed transactions, confirm their validity, and perform the proof-of-work requirements to submit these transactions as a new block of transactions. In order to provide an incentive for node operators to process and confirm new transactions, and to compensate for the equipment and energy costs required to do so, a fee is charged to confirm each transaction. The size of the fee is proportional to the size (in bytes) of the transaction and is quantified as the fee rate; otherwise miners would prefer smaller sized transactions, as they could fit more into each block. The pool of unconfirmed transactions is automatically sorted by transaction fee rate, so that miners confirm transactions with a higher fee rate before those with a lower fee rate. Because of this, it is expected that as the number of unconfirmed transactions increases, the fees paid to ensure a transaction gets processed will also increase. This is shown in the figure below. Perhaps one reason the number of unconfirmed transactions grew was because the fee rate offered for many of these transactions was below some threshold where it wasn't worth the miners' efforts to confirm them. The total value of confirmation fees earned per day and the size of the unconfirmed transaction pool are plotted below.

In [14]: (code cell: load TRFEE, the total BTC value of transaction fees miners earn per day.)
In [15]: (code cell: clean the transaction fee data.)

In [16]: (code cell: Plotly figure with two y-axes, 'Number of unconfirmed transactions' and 'Daily sum of confirmation fees (BTC)'.)

Out[16]:

It looks as if confirmation fees correlate positively to the number of unconfirmed transactions. This is expected, as users would need to pay higher fees when there are a lot of unconfirmed transactions, in order to have their transactions moved towards the front of the queue and processed sooner. However, it looks as if changes to the miners' fee rate lag behind changes in the size of the unconfirmed transaction pool by about 2 weeks. The variation in the transaction fee is also a lot smaller than the variation in the size of the unconfirmed transaction pool. This suggests that the method for calculating the transaction fee rate could be improved so that the fee rate responds faster to changes in the number of transactions awaiting confirmation. This would make mining less profitable and more competitive, and would make the Bitcoin network cheaper for users.
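The roughly two-week lag could be quantified by correlating the fee series against lagged copies of the mempool series; a sketch on synthetic data where the lag is 3 samples by construction:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# A smoothed random series standing in for the mempool size
mempool = (
    pd.Series(rng.normal(size=200))
    .rolling(10).mean()
    .dropna().reset_index(drop=True)
)
fees = mempool.shift(3)  # fees trail the mempool by 3 samples, by construction

# Pearson correlation of the fee series against each candidate lag of the mempool
corrs = {lag: fees.corr(mempool.shift(lag)) for lag in range(8)}
best_lag = max(corrs, key=corrs.get)
print(best_lag)
```

The lag that maximises the correlation recovers the built-in offset; on the real series the same scan would estimate the fee-rate delay.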
Let's look at how expensive it is to use the Bitcoin network by analysing the transaction fee rate relative to transaction volume.

Ratio of transaction fees to transaction volume¶

In [17]: (code cell: load CPTRV, the average transaction confirmation fee rate (%).)

In [18]: (code cell: Plotly figure of miners' revenue as a percentage of the transaction volume, with 'Fee rate' and '7 day average' series.)

Out[18]:

The results show that a fee rate (miners' revenue as a percentage of transaction volume) of 0.5-1% is typical on the Bitcoin network. This is a bit cheaper than ecommerce payment methods. Surprisingly, there is a correlation of -0.25 with the number of unconfirmed transactions. This means the fee rate decreases when the number of unconfirmed transactions increases, although the correlation is weak. One possible explanation for this may be that activity on the network increases when the price of Bitcoin increases. When the price of Bitcoin increases, more resources are allocated to mining because it is increasingly profitable. Also, more people decide to buy Bitcoin because it's becoming so valuable. This leads to more transactions, but even more miners competing to confirm transactions and claim the rewards. This increase in supply drives down the transaction confirmation fee rate. Let's see how the number of transactions per day has changed in 2017 so far.

Number of transactions per day¶

In [19]: (code cell: load NTREP, the number of transactions per day excluding the 100 most popular addresses, and NTRAN, the number from all addresses; compute NTRFP, the number from the 100 most popular addresses only, as NTRAN - NTREP.)

In [20]: (code cell: Plotly figure of Bitcoin transactions per day, showing all addresses, all addresses excluding the 100 most popular, their 7 day averages, and the popular-addresses difference on a second y-axis.)

Out[20]:

The figure above shows the number of transactions posted each day from all addresses, and the number of transactions each day from addresses excluding the 100 most popular addresses. The difference between the two (the number of transactions from the 100 most popular addresses) is shown in blue using the axis on the right. There is a positive correlation with the size of the unconfirmed transaction pool. Interestingly, there is a stronger correlation for transactions created by the 100 most popular addresses (0.54) than for unpopular addresses (0.46).
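The correlation coefficients quoted throughout are plain Pearson correlations; with pandas the whole pairwise table comes from a single `DataFrame.corr()` call (series values invented for illustration):

```python
import pandas as pd

# Three made-up daily series standing in for the blockchain metrics
df = pd.DataFrame({
    "unconf_trnsx": [1.0, 2.0, 3.0, 4.0, 5.0],
    "conf_fees":    [2.0, 4.0, 6.0, 8.0, 10.0],  # exactly proportional
    "fee_rate":     [5.0, 4.0, 3.0, 2.0, 1.0],   # exactly inverse
})

# Pairwise Pearson coefficients in one call (method='pearson' is the default)
corr = df.corr()
print(corr.round(2))
```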
Possible reasons for this are…

Finally, let's consider the influence of the price of Bitcoin on the size of the unconfirmed transaction pool.

Bitcoin price¶

In [21]: (code cell: load MKPRU, the USD value of BTC.)

In [22]: (code cell: Plotly figure of the Bitcoin price in USD, with 'Daily' and '7 day average' series.)

Out[22]:

Apart from showing a notable increase of around 500% in 13 months, the price has a correlation of just 0.42 with the size of the unconfirmed transaction pool. The number of transactions coming from popular addresses is positively correlated (0.41) to the Bitcoin price, suggesting that when the Bitcoin price increases, trading activity on exchanges also increases. Transactions from less popular addresses are inversely correlated to the Bitcoin price (-0.31); this could be because when the Bitcoin price surges, individuals holding Bitcoin do not want to spend it to make purchases, while new buyers need to use an exchange to convert fiat currencies into Bitcoin. Note that ordinarily a new address is used for each transaction.

Ratio of unique addresses to transactions¶

I want to compare the number of unique Bitcoin addresses used to the total number of transactions created. I initially expected the ratio of addresses to transactions to be close to 1, not realising that each transaction will contain at least 2 addresses (1 input and 1 output, and probably a 2nd output address which is equal to the input address, for the change). If each transaction on average has 2 outputs, then the ideal ratio of Bitcoin transactions to addresses will be 0.5.
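That 0.5 figure is just transactions divided by unique addresses when each transaction touches 2 unique addresses; a toy check (addresses invented, with change assumed to return to the input address as described above):

```python
# Each tx: (input address, output addresses); change goes back to the
# input address, matching the assumption in the text
txs = [
    ("A", ["B", "A"]),  # A pays B, change back to A
    ("C", ["D", "C"]),
    ("E", ["F", "E"]),
    ("G", ["H", "G"]),
]
unique_addresses = {addr for sender, outs in txs for addr in [sender, *outs]}
ratio = len(txs) / len(unique_addresses)
print(ratio)  # 4 transactions / 8 unique addresses
```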
In\u00a0[23]: # unique addresses used each day NADDU = NADDU = #number of transactio is NTRAN RATIO = NTRAN / NADDU d1 = d2 = RATIO = In\u00a0[24]: series1 = go.Scatter name='Dail line = dict( color = (color2), width = 2)) series2 = go.Scatter name='7 day average', line = dict( color = (color1), width = 3)) data = [series1, series2] layout = go.Layout( title='Rat of transactio to unique addresses' yanchor='t y=1.1, x=0.5) ) fig = layout=lay py.iplot(f Out[24]: The figure above shows that the ratio of unique Bitcoin transactio to unique addresses approaches 0.5. If users reuse an address for multiple transactio (which is bad) then the ratio will rise above 0.5, and if users create transactio with more than the usual minimum of 2 unique addresses then the ratio will dip below\u00a00.5. Correlatio between each time series\u00b6The table below shows the Pearson correlatio coefficien between each time In\u00a0[25]: '''' # using daily averages tseries = [ av_bs['Siz tn_fee['Fe RATIO['Val ] cols = ['Unconf trnsx', 'Conf time', 'Block size', 'Trnsx/blo 'Conf fees', 'Fee rate', 'USD/BTC', 'Trnsx - pop addrs', 'Trnsx - unpop addrs', 'Tnsx : Addrs ratio'] tbl = len(tserie for i in for j in tbl[i,j] = # values index=cols # 1st column as index columns=co # 1st row as the column names '''; In\u00a0[26]: # Using 7 day moving average tseries = [ cols = ['Unconf trnsx', 'Conf time', 'Block size', 'Trnsx/blo 'Conf fees', 'Fee rate', 'USD/BTC', 'Trnsx - pop addrs', 'Trnsx - unpop addrs', 'Tnsx : Addrs ratio'] tbl = len(tserie for i in for j in tbl[i,j] = # values index=cols # 1st column as index columns=co # 1st row as the column names Out[26]: .dataframe thead tr:only-ch th { text-align right; } .dataframe thead th { text-align left; } .dataframe tbody tr th { top; } Unconf trnsx Conf time Block size Trnsx/bloc Conf fees Fee rate USD/BTC Trnsx - pop addrs Trnsx - unpop addrs Tnsx : Addrs ratio Unconf trnsx 1.000000 0.512825 0.572234 0.662592 0.751183 -0.318481 0.476519 0.684877 
0.732961 0.042633 Conf time 0.512825 1.000000 0.875708 0.821096 0.652057 -0.347289 0.519717 -0.016592 0.466974 0.234847 Block size 0.572234 0.875708 1.000000 0.856833 0.748027 -0.435521 0.624602 0.387743 0.522778 -0.082361 Trnsx/bloc 0.662592 0.821096 0.856833 1.000000 0.619106 -0.513678 0.332789 0.263169 0.915094 0.402304 Conf fees 0.751183 0.652057 0.748027 0.619106 1.000000 -0.326240 0.824995 0.661089 0.378880 -0.363753 Fee rate -0.318481 -0.347289 -0.435521 -0.513678 -0.326240 1.000000 -0.217555 -0.591717 -0.393656 0.126348 USD/BTC 0.476519 0.519717 0.624602 0.332789 0.824995 -0.217555 1.000000 0.523433 -0.354745 -0.716468 Trnsx - pop addrs 0.684877 -0.016592 0.387743 0.263169 0.661089 -0.591717 0.523433 1.000000 0.244647 -0.409759 Trnsx - unpop addrs 0.732961 0.466974 0.522778 0.915094 0.378880 -0.393656 -0.354745 0.244647 1.000000 0.471671 Tnsx : Addrs ratio 0.042633 0.234847 -0.082361 0.402304 -0.363753 0.126348 -0.716468 -0.409759 0.471671 1.000000 if { var mathjaxscr = = = = ? \"innerHTML : \"text\")] = + \" config: + \" TeX: { extensions { autoNumber 'AMS' } },\" + \" jax: + \" extensions + \" displayAli 'center',\" + \" displayInd '0em',\" + \" showMathMe true,\" + \" tex2jax: { \" + \" inlineMath [ ['$','$'] ], \" + \" displayMat [ ['$$','$$' ],\" + \" true,\" + \" preview: 'TeX',\" + \" }, \" + \" 'HTML-CSS' { \" + \" linebreaks { automatic: true, width: '95% container' }, \" + \" styles: { .MathJax .mo, .MathJax .mi, .MathJax .mn': {color: 'black ! important' }\" + \" } \" + \"}); \"; (document. || }"},{"title":"Corporate\u00a0London","category":"Non-technical/Journal","url":"corporate.html","date":"10 July 2017","tags":"career, corporate, london ","body":"2014 -\u00a02017 Having made arrangemen to leave London, it seems like a reasonable time to reflect on my time\u00a0here. London is a tough city to live in \u2014 it\u2019s big enough that a sub 1-hour commute is considered good, and it\u2019s super expensive. 
My stay in London has been defined by my quest to complete my graduate scheme and qualify as an accountant. When I first arrived I had no idea what London or working in a corporate would be like. I was coming from academia, and my motivation for moving to finance could be summed up in two points:

Understand more about the 2008 financial crisis.
Get paid more for using mostly the same skills (math) as in engineering.

I don't think there's anything wrong with these motivations, but I should have been making a long term career plan instead. When I started my job I was surprised at the 6 weeks of induction and hot-desking. Life as an auditor felt nomadic, as everyone would spend large amounts of time at clients' offices, and no-one has their own desk in our office. The only thing you really own is your knowledge and your network. In 2014, I arrived with a high opinion of my employer and the view that I would be staying for several years. Having escaped the financial insecurity of short-term research grants, I'd moved back to my home country to contribute to the system that educated me. I was happy to have a regular job with a decent, steady income. A phrase that kept coming to mind back then was "accountant factory". Our training materials were scripted and everything was a standardised process. We were being processed. Graduates in, corporate accountants out. The company is huge and so are the efficiencies and barriers to competition that come with this. When I arrived I was impressed there were free biscuits, and it felt presumptuous to put a meeting in someone's diary. Now I ignore any communication addressed to a mailing list.

Accountancy

The toughest experiences were related to the accountancy qualification. Naïvely I had believed that the qualification wouldn't be a big deal and I didn't give any thought to it when I applied — I just wanted to find out how banks worked.
If there were some exams to be done then it would be fine; I’d already achieved a PhD and I would handle it. That turned out to be a mistake. A huge mistake. The alarm bells should have rung louder when I realised most of my graduate colleagues had only applied so that they could get the qualification (and were already intending to leave as soon as possible afterwards). Doing the pre-course work before college wasn’t trivial. College — I was being sent to a classroom again. We were being taken out of the office, away from clients, and put in a classroom to prepare for these exams. Starting a job that required becoming a chartered accountant without considering the effort and time involved to qualify was dumb. It’s an oversight I find hard to believe. The disappointment and sadness at having to study again was profound. The teaching and assessment style was several steps behind what I’d become used to. Compared to the depth, autonomy and research-focus of a doctorate, writing cookie-cutter essays in time-pressured exams felt stupid. The exams were hard and I failed several of them, probably because of my low morale whilst revising. Retaking them required more weekends and annual leave being spent away from my family, camped out in libraries and offices. This required much patience and generosity from Ritsya, who, having not seen me for long periods during my doctorate, expected us to have a more normal lifestyle when we came to London. I found it miserable to pause my social life and other interests whilst studying, and then I would try to rush back to them when I had the time, knowing that soon I’d have to pause them again. If I hadn’t already studied in Vienna, this process of working and studying in London would have been exciting and felt valuable or special.
I\u2019d passed enough exams though to know that they\u2019re never as important as they\u2019re made out to be, and whatever it was that was missing from my life wasn\u2019t going to be found in On the upside though, the ACA is the most practical qualificat I\u2019ve gained and has taught me many useful aspects of business and finance. I\u2019m glad to finally see business not as some mysterious system but something attainable Neighbourh Living in London has also required a lot of time on trains \u2014 my door to desk commute is about an hour each way, and I\u2019ve never lived somewhere in London that hasn\u2019t felt transient. We\u2019ve needed to live near a station and didn\u2019t want to commute more than an hour and that\u2019s put us in neighbourh with other young profession who also don\u2019t have long term plans (or financial ability) to stay in the city. We\u2019re all looking to move on and move up as fast as possible. I want to leave London partly because there are so many people with the same attitudes and priorities as my\u00a0own\u2026 Back in the office I spent the first few months figuring out how people were organised, how teams operated and how decisions were made. I think it took me about a year to feel like I understood how things really worked, and 1.5 years to feel like I could do all aspects of my job with certainty. There is a difference between how things are spoken about, and how things My experience has been that performanc is all that matters and behaviour is defined by self-inter I don\u2019t think this is different from previous environmen I\u2019ve worked in \u2014 academia isn\u2019t any different1 and the constructi industry certainly isn\u2019t. I wonder if the only way people can endure it is if they\u2019ve not experience anything else, or think it normal, or necessary. 
I find myself wishing we could be kinder to each other, and that we could create structures that incentivise it. Corporate finance usually appears clean and well presented. Its people appear dependable and capable. The culture is sanitised and there’s a lot of pressure to conform — you can see it in the clothes and accessories we wear and the jokes we tell each other in the canteen. I came to see my office as a glass and steel cathedral. I read about the Middle East, and migrants arriving in Europe, and tower blocks burning, and felt helpless. Despite the high regard with which we hold ourselves, we don’t ask each other how we can help. Maybe we are too busy, or feel powerless to help. Apparently we do not know how to help, despite our wealth, talents, education. I remember watching a man leave the office one day and thought that if he gave his whole career and left his firm after decades of service, there would be nothing left to identify him after a few weeks. The work will always get done. Market forces will dictate how the business adapts and grows. The City rolls on. Publications trump almost all other metrics, and the incentives to publish quantity over quality are such that risky or slow research is inexcusable. The metrics used to standardise success and allocate resources can be subverted just as well in academia as in any other industry.
What management measures, the team prioritises."},{"title":"Blockchains from the ground up: Part 2","category":"Technical/Cryptocurrencies","url":"blockchain-networks.html","date":"8 July 2017","tags":"blockchains, digital currencies, distributed ledger technology, distributed consensus, sybil ","body":"Maintain an accurate list of transactions across a large group of users, without a central authority. This is part 2 of an introduction to the key features of a generalised blockchain. Part 1 introduced the key features of immutable record creation between 2 parties using public key cryptography. Part 2 explores how a network of users can maintain the same (true) list of transactions and protect each other against fraud. Broadcasting transactions to the network In Part 1 we saw Lizzie, John and Chris exchanging coins. Lizzie also paid John with coins that were owed to her by Chris. These transactions were authenticated using PKI, which: Ensured transactions could not be altered once made. Prevented participants claiming that they didn’t make a transaction. Prevented anyone creating a transaction on someone else’s behalf without their consent. As the number of people in the network grows, the transfer of coins from one user to another becomes harder to track. If every user’s ledger is not identical then the opportunity arises to use coins that have already been spent to pay someone who doesn’t know they’ve already been used. This is double spending, and it is possible because the ledger that is shared amongst all members of the group only has weak consistency — it is not necessarily correct all the time in all locations. Weak consistency could be solved by requiring that everyone votes to accept a transaction before it is accepted into the ledger (unanimous consensus), or, to save time, we could reduce the requirement so that only 50% of all users validate a transaction before it is accepted into the ledger (quorum consensus). Either of these solutions is possible for a small local group with a list of all users.
However, unanimous or quorum consensus doesn’t solve the weak consistency problem if: The group is large. The group is small but spread across different locations or timezones. It is not possible to know how many members there are, and therefore what proportion of users have voted. The real identity of a user is unknown. In these cases a peer-to-peer network is required, where transactions between users require approval by other users before being confirmed. This has not been trivial to solve, as some users would be incentivised to be dishonest, and some may make mistakes. This is the distributed consensus problem, which Wikipedia defines as: The consensus problem requires agreement among a number of agents for a single data value. Some of the processes (agents) may fail or be unreliable in other ways, so consensus protocols must be fault tolerant or resilient. The processes must somehow put forth their candidate values, communicate with one another, and agree on a single consensus value. When the number and identity of participants is known, distributed consensus is possible. Two types of protocol which allow all users in a distributed system to agree on a transaction are the Paxos family of protocols and the two-phase commit protocol. Both of these would require that at least 50% of all users reach agreement in order to add a transaction. However, in a public peer-to-peer network the total number of active users is not known — it’s fast and cheap to create new user profiles, and existing user profiles may become dormant. This makes it impossible to know how many users 50% would be. Additionally, because it’s possible to cheaply create new user profiles (just generate a new public-private key pair), a single actor could generate and control many user accounts in order to have many votes and force incorrect transactions onto the ledger. An attack where one user subverts a network by creating many profiles is known as a Sybil attack.
Proof of Work The solution to the Sybil attack is to increase the cost of verifying a transaction such that the cost exceeds the reward. This is achieved through proof-of-work (PoW) algorithms, which are expensive for a sender claiming to have verified a transaction and simple for the receiver to check. One possible proof-of-work approach is to require that the hash of a verification message begins with a certain set of characters — for example, three zeros. The value that is varied until the hash fits is called a nonce, and the only way to produce a verification message with an acceptable hash is to try many slightly different messages. The choice of required prefix is arbitrary, but the longer it is the more difficult it becomes to find a hash that fits. This is because a hash function behaves unpredictably: altering even a single part of the data being hashed will result in a completely different hash value, so there is no way to predict a hash value in advance. The only way to generate a hash with the required prefix is to repeatedly alter the data being hashed (even by just one character) until a hash with the required features is randomly achieved. This is expensive to do, but simple to verify. Using this method, a user who seeks to verify a transaction and broadcast the result must (once they’ve verified the transaction) repeatedly try different nonces until they randomly find a message that meets the requirement. It is simple for another user to check that a transaction verification message meets the requirement, because it is simple to inspect a hash and compare it to the required prefix. The effect of this requirement is a process that makes it expensive to claim that a transaction has been verified and cheap to check that verification claim.
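The expensive-to-produce, cheap-to-check asymmetry described above can be sketched in a few lines of Python (an illustrative toy, not any particular currency’s scheme; SHA-256 and the three-zero prefix are assumptions):

```python
import hashlib

def proof_of_work(message: str, prefix: str = "000") -> int:
    """Expensive: trial and error is the only way to find a nonce such that
    sha256(message + nonce) starts with the required prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

def verify(message: str, nonce: int, prefix: str = "000") -> bool:
    """Cheap: a single hash confirms or rejects the claimed work."""
    return hashlib.sha256(f"{message}{nonce}".encode()).hexdigest().startswith(prefix)

nonce = proof_of_work("Lizzie pays John 1 coin")
assert verify("Lizzie pays John 1 coin", nonce)
```

With a three-hex-character prefix, finding a nonce takes a few thousand hashes on average, while checking the claim takes exactly one — the asymmetry the scheme relies on.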
This removes the threat of a Sybil attack, but does not remove the distributed consensus problems created by not knowing: The true identity of users in the network. How many users exist. This problem cannot be completely solved, and the practical solution is to relax the requirement such that the probability of accepting a fraudulent transaction is lower than some user-defined threshold. This is acceptable because a user would require a higher degree of confirmation for a high-value transaction than for a low-value transaction, and would therefore be willing to incur more time and cost to verify a high-value transaction and reduce the probability of accepting an incorrect transaction below a threshold. If a user wishes to make fast or low-value transactions, or trusts the party they’re transacting with, then they may accept a transaction without any other users on the network verifying that the sender has the required funds. However, when the sender’s ability to pay is not assured, verification is required. The more risky or valuable the transaction, the more users the receiver of the funds will ask to verify that the sender has access to the required funds. The higher the number of users, the higher the probability that a dishonest transaction will be identified before it is accepted. An appropriate level of verification will depend on the amount being transferred and how well the receiver of the funds knows the sender. Asking peers on the network to verify transactions introduces a new problem. Verifying a transaction requires time and effort, and incurs a cost. This cost requires that network participants be rewarded for correctly verifying transactions between others. An attacker would only attack if the cost is less than the reward, so the number and cost of verifications required should be just enough to make the cost of an attack more than the value of the transactions being protected. This introduces the problem that it can cost more to verify a transaction than the value of the transaction itself.
It also creates the recursive problem where the users who verified the first transaction would need to verify that the payment they received was itself valid. Furthermore, a high proportion of the original transaction value is spent as a transaction fee (for verification), which is not efficient. These problems are avoided by combining multiple transactions and verifying them at the same time — broadcasting the successful verification of many transactions simultaneously by grouping them together into a block. By confirming multiple transactions at once (and proving it with a single proof of work), transaction fees can be aggregated (allowing each individual fee to be much lower). Each block includes a list of verified transactions, a reference to the previous block, and a block ID. Incentivisation The transaction verification process outlined above is remarkable because it creates a demand for new participants in the network by creating a financial incentive to verify transactions. This makes the network more secure, as increasing the number of participants makes a Sybil attack more expensive. Summary Users generate new transactions and broadcast them on a peer-to-peer network. An idle user listens for new transactions and collects them until the sum of all transaction verification fees is greater than the cost the user will incur to verify them and meet the proof-of-work requirement. The idle user adds an extra transaction to their list of transactions that transfers the sum of the transaction fees to their own address. The idle user generates the block of newly verified transactions, referencing the previously verified block so that transactions can be ordered, and completing the proof-of-work challenge.
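The summary steps can be sketched as a toy mining loop in Python (a sketch only: the field names, fee handling and three-zero difficulty are illustrative assumptions, not any real implementation):

```python
import hashlib
import json

def mine_block(transactions, fees, prev_block_id, miner_address, prefix="000"):
    """Collect transactions, add a fee-paying transaction to the miner,
    then search for a nonce that satisfies the proof-of-work requirement."""
    # Extra transaction transferring the collected fees to the miner's own address.
    txs = list(transactions) + [{"to": miner_address, "amount": sum(fees)}]
    nonce = 0
    while True:
        # The block references the previous block so that blocks can be ordered.
        body = json.dumps({"txs": txs, "prev": prev_block_id, "nonce": nonce}, sort_keys=True)
        block_id = hashlib.sha256(body.encode()).hexdigest()
        if block_id.startswith(prefix):  # proof-of-work met
            return {"txs": txs, "prev": prev_block_id, "nonce": nonce, "id": block_id}
        nonce += 1

block = mine_block([{"to": "John", "amount": 1}], fees=[0.01],
                   prev_block_id="0" * 64, miner_address="Chris")
```

Because the block ID is a hash over the transactions, the nonce and the previous block’s ID, tampering with any transaction invalidates the proof of work and the link to the rest of the chain.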
This new block is then broadcast to the network. Other users are listening for new block announcements. These users verify that the block is valid according to the proof-of-work requirement and the order of the blocks. Users with unverified transactions look inside the verified block to see if their pending transactions have been accepted. Competing to validate blocks Each user can choose which transactions they verify, and how many to verify before beginning the proof-of-work challenge and hopefully collecting the transaction fees. This lack of order around transaction verification is fine, because the only way to increase the probability of being the first to claim the transaction fees associated with a collection of transactions (a block) is to spend more CPU power searching for the required partial hash. If two users complete a block at approximately the same time then the blockchain will look different in different parts of the network, as each completed block begins to propagate and other users accept the new block and add it to their ledger. This is OK if a rule is enforced that requires a user to always accept the longest chain of blocks. This works because if multiple blocks are created at the same time, the time it takes to create subsequent blocks will vary due to the random behaviour of the proof-of-work algorithm. Therefore chains of different lengths will always exist and one version of the blockchain will be longer than the others, providing a clear candidate for which branch of the blockchain to use.
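The longest-chain rule itself is very compact when written out (a sketch; blocks are represented as plain dicts purely for illustration):

```python
def choose_chain(local_chain, candidate_chain):
    """Always accept the longest chain of blocks; otherwise keep the current one."""
    return candidate_chain if len(candidate_chain) > len(local_chain) else local_chain

# Two branches created at roughly the same time: the network converges on
# whichever branch is extended first, because that branch becomes the longest.
ours = [{"id": "b1"}, {"id": "b2a"}]
theirs = [{"id": "b1"}, {"id": "b2b"}, {"id": "b3"}]
adopted = choose_chain(ours, theirs)
```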
If there are transactions in the discarded branch which are not present in the new (longest) blockchain then they are added back into the pool of transactions. A block of transactions is never final The above procedure for verifying transactions and adding new blocks onto the chain means that even if a user inspects a new block and sees that their transaction has been verified, it’s possible that in the future a longer chain will be discovered (which must be accepted) which doesn’t include that block. Therefore any block could potentially be removed, which means a transaction is never completely verified. However, the probability of a block being removed decreases as the number of blocks after it increases. This means verification can be thought of in terms of the number of blocks that have been added to the chain after the block containing the transaction. If you are willing to accept a high level of risk, or you trust the party you are transacting with, you could opt for a small number of blocks to be added after the block containing your transaction. This has the benefit of increasing the speed of the transaction verification. If the transaction is risky or high-value you might require a larger number of blocks to be added to the chain before accepting the transaction. This will increase the time required to verify the transaction but reduce the probability that a longer chain will undo the block containing the transaction in question."},{"title":"Move","category":"Non-technical/Journal","url":"move.html","date":"8 July 2017","tags":"lifestyle ","body":"Ambition Since I was 15, one of my ambitions has been to be an entrepreneur. I used to joke with my dad about buying him a nice boat one day. Life so far has been predominantly about education, and that stage is over now. I’m walking away from what I’ve come to see as a lifestyle and career that has too many compromises — it doesn’t make sense to live like this. Ultimately I want to develop multiple sources of passive income. I want to create.
Summer 2017 The last few months have been intense, with some weeks leaving me feeling ambitious and energetic, and others feeling anxious and overwhelmed. I need to get better at defining a goal, taking the quickest path there, and ignoring distractions. Ritsya is brave enough and imaginative enough to force me to think big and consider how to live a better life. My biggest fear is that I screw it up, shooting myself in the foot and Ritsya and my daughter also. We were headed safely to an unremarkable existence and it would be terrible to swap that for something worse. That won’t happen. I know that I make better decisions and produce my best work under pressure; I thrive when I’m perceived as an underdog. I need to accept this without using it as a reason to be foolish. Whilst I don’t have a clear plan, or direction, or goal (I have many of them), I’ve got skills and I want to see what I can make. I’ll never do my best work, using my most productive combination of skills and experience, if I’m a cog in someone else’s machine. It’s safer being an employee than self-employed. I guess the price of removing risks is the difference between the value you generate for your employer and what you’re paid. I think those risks are overpriced and most people are more capable than they realise. Sometimes you have to walk into a situation to find out how to make the most of it. And sometimes you have to leave a place in order to move on. Autumn 2017 Last year my boss gave me some advice that was supposed to be encouraging. I’d requested to reduce my involvement with some stuff that was unrelated to my job so that I could contribute more to my team and still maintain some semblance of a personal life. My boss was pretty clear that reducing my involvement in the extra stuff would not be possible. During our conversation he advised me not to worry about how much I contributed to the team because “the work will get done anyway”.
This was meant to be encouraging but instead removed any conviction that the work I did was important."},{"title":"Flee","category":"Non-technical/Journal","url":"flee.html","date":"4 July 2017","tags":"poetry ","body":"Summer children squeal and shriek, Watch wars. Excitement mounts. How exotic. Unexpected. Death arrives. Grow-up Grow old Face death Evade evil. Don’t tire. World is fragile. Towers burn, Bridges bludgeon, Markets stab. Got to get out."},{"title":"Understanding VC Investment","category":"Non-technical/Entrepreneurship","url":"investment.html","date":"9 June 2017","tags":"lis, fintech, startup, founder ","body":"I attended the Lisbon Investment Summit in June and wrote about my experience here. One of the best sessions was with Boris Golden, called “Understanding Investment”. These are my notes: VCs are seeking to identify high-potential startups, and then support and fund them. They are looking for something that is innovative and unproven. A startup is not a company but an organisation searching for a business model, whilst executing and discovering a scalable way to grow. Startups need money for ambitious but credible growth plans. A typical stake for a VC could be roughly 20%. VCs want an exit price of at least $100m, otherwise their business models don’t work out. $10m can seem a lot for a founder with a 30% stake, but it’s not enough to attract VCs, so aim higher. How to pitch Identify specific people with real needs. Size of market. Why now? What is your clear competitive advantage — why can no-one else do this?
Market Find a large and growing market. Management Build a smart, skilled and cohesive team. With a strong ability to deliver quickly and to learn quickly. That is ready to go big whatever it takes. With a unique vision. Model Valuable and differentiated. Efficient go-to-market. And profitable scalability. Momentum Show traction and that you’ve cracked the market. Show ambitious and credible growth plans. With a growth model. And a clear strategy to scale and win."},{"title":"How to be an ambitious founder in Europe","category":"Non-technical/Entrepreneurship","url":"ambitious.html","date":"9 June 2017","tags":"lis, startup, fintech, europe ","body":"Whilst at the Lisbon Investment Summit I went to a session called “What it means to be ambitious for founders in Europe” by Oussama Ammar. I’ve written generally about the summit here. Oussama spoke with passion and humour. It’s clear that he cares about encouraging would-be founders and confronting some of the cultural hurdles that exist in Europe. These are my notes from the session: If you are serious about building something that matters then you have a great and exciting future ahead. You can never predict who will be successful and who will fail. There are a lot of dumb successful people, and clever founders can fail. It’s really important to learn how to leverage your time effectively. You can lose anything else and get it back, but you can never recover the time you’ve already spent. The only way to succeed is to try hard. It’s hard to do something that matters. In Europe you need to be more ambitious than average (measured against other countries) because the environment makes it harder to be an entrepreneur. (Attitudes to risk, comfort, security, expectations of failure… are all unhelpful for founders.) It’s not impossible, just harder. So be more ambitious. Everyone is replaceable, no-one is unique. You can always lose money and replace it. There is so much money in the world.
Take a look at Crunchbase and see the failure rate for companies that raised $1m – $10m; it is the same as the rate for those that raised $10m+. If you lose money on a project you will learn things from that experience and you can leverage that learning… but you will never get back the time you spent. Zombies are companies that make enough money to survive but not enough to provide joy to the people in the company — aim high and… Starting a company is a big deal, like starting a marriage. Think hard about what you will be doing, what problem you are trying to solve, and who you will be working with. In Europe there is not enough money to go around and this makes pitching hard. Silicon Valley doesn’t have this problem. On average a startup in Europe will raise less than a startup in the USA. Europe is irrationally concerned about risk. The USA is less concerned with risk and this is very helpful. So be pragmatic about risk. Incorporate in London where the laws are good, even if you don’t want to operate in the UK. It has a better-designed system of corporate law. Estonia is also good. Best European city for founders? London, Paris or Berlin? There isn’t a single best city because none are holistic. London has the money. Paris has engineering and product development (and no-one outside Paris knows this). Berlin has automation, execution and scaling. …You need to draw on all 3 of these cities, which when taken together can surpass Silicon Valley. Anecdotes The founder of Slack lives in Vancouver with his wife and kids. He spends 1 day each week in California and everyone thinks Slack is Californian due to good marketing. Forget national pride, be pragmatic. Leverage the internet. The founders of Airbnb would fly from New York to California for 1 day each week.
They were originally in California where Airbnb started, but decided that if they could make it work in New York then Airbnb would be a success, so they moved to New York and flew back for meetings, etc. If you ask Europeans to come to a meeting in London (which is not as far as California is from New York) people start complaining and deciding they can’t be bothered."},{"title":"The Lisbon Investment Summit","category":"Non-technical/Entrepreneurship","url":"lis17.html","date":"9 June 2017","tags":"fintech, startup, europe, lis ","body":"I attended the Lisbon Investment Summit on June 6 and 7 as part of the Oula.la InsurTech startup team with John Sullivan. It was the first startup conference I’d been to and I arrived hoping to have some useful conversations and understand more about the Fintech and startup spaces in Europe. As usual I wanted to focus on tech and finance. Who was there As well as some outstanding sessions there was a high ratio of investors to startups. This made talking to VCs, bankers and M&A lawyers really easy. I was happy to meet a London-based VC when I sat down to have lunch, and thanks to interrupting a session to ask blockchain-related questions on Day 1, a banker struck up a conversation with me over breakfast the next day. I met a lot of people across many relevant roles, and once I’ve worked through my notes and my new collection of business cards I hope to have some great follow-up conversations. The most useful sessions for me were: “What it means to be ambitious for founders in Europe” by Oussama Ammar from TheFamily. “Understanding Investment” by Boris Golden. “Building successful businesses on blockchain technologies”, a panel discussion moderated by Kevin Loaec, founder of the Chainsmith blockchain consultancy, and including Mir Serena Liponi from Blockchain. The blockchain session was valuable because it’s unusual to meet people who have been working in the blockchain space for several years.
I\u2019m always wary of being distracted by the hype and noise around developmen in the blockchain space so I appreciate hearing some informed and The opening session included a speech from the Mayor of Lisbon and the next day opened with a speech from the Secretary of Industry. Both speeches conveyed an open and progressiv attitude to internatio cooperatio aimed at promoting and supporting founders and startups in Portugal. The warm words were supported by practical measures including tax incentives and a state-back scheme to match amounts contribute by private investors. It was really refreshing to hear a politician extol the virtues of multinatio cooperatio and bringing different cultures into Portugal. I wish Britain could do this\u00a0too. On the topic of Brexit, which inevitably arose due to London\u2019s present role as a centre of finance and innovation it seems Europe is still expecting the UK to come to its senses, and cannot understand why it\u2019s destroying its goodwill and reputation so\u00a0thoroug Overall The two days in Lisbon were full of useful and energetic conversati and it was a great experience pitching Oula.la multiple times and talking about what we\u2019re trying to do and how blockchain are a part of that. The opportunit to meet founders and investors are invaluable for making informed decisions. Lisbon is a beautiful city which I found easy to get around and always felt safe in. It\u2019s also refreshing affordable"},{"title":"Blogging with Pelican: Design, Plugins,\u00a0Sharing","category":"Technical/Web","url":"pelican2.html","date":"1 June 2017","tags":"pelican, jinja, python, twitter, facebook, blog, static site ","body":"Design My approach to building my blog is to keep it as simple as possible, only adding features when they make a significan improvemen to how the content is understood and used. Therefore I\u2019ve done away with several features that would normally come baked into a WordPress theme. 
For example, a footer full of links that would never be used, and a sidebar full of widgets. I opted for a single-column design that hopefully presents text-heavy articles clearly and intuitively (please leave a comment and tell me what you think). Plugins My use of plugins to extend Pelican’s functionality reflects this: there is the Neighbors plugin, so that the next or previous post can be accessed from the bottom of a post without going back to the index, and the Tag Cloud plugin to reflect which subjects are written about the most (and provide a link to all posts with a given tag). Speed The speed of the site is important because a faster site is more enjoyable to use. Therefore I’ve minified the CSS and the JavaScript using the Assets plugin. I’ve also set the CSS and JavaScript to load asynchronously. Images are optimised using the Optimize Images plugin so that their file size is as small as possible and they download quickly. The site uses CloudFlare’s free CDN features, so hopefully no matter where you view the site from you get a decent page speed. I’ve also arranged the homepage so that posts are shown by their category and then by posting date. This may not work very well with a larger number of posts, but I’ll only consider that problem once it presents itself. Designing for hypothetical conditions that don’t yet exist is a waste of time. There are examples of how I’ve used Jinja templates below in the context of sharing my articles on Twitter and Facebook. Plugin: Share Post I noticed that my posts were beginning to get tweeted about, so I thought it would be useful to have some sharing buttons at the bottom of each post for Twitter, Facebook and Email. Looking at the Pelican Plugins repo on GitHub showed there was (as usual) a plugin for this (called Share Post), though I noted it hadn’t been updated for a couple of years. Installing and initial set-up was simple thanks to the readme on the git repo.
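For reference, the plugin configuration for a set-up like this typically lives in `pelicanconf.py` (a sketch; the folder names follow the conventional pelican-plugins naming and may differ between installs):

```python
# pelicanconf.py (fragment)
PLUGIN_PATHS = ["plugins"]  # where the copied plugin folders live
PLUGINS = [
    "neighbors",        # next/previous links at the bottom of each post
    "tag_cloud",        # weighted list of the most-used tags
    "assets",           # CSS/JS minification via webassets
    "optimize_images",  # shrink image file sizes
]
```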
You need to copy the plugin folder to the plugins directory, and add the name of the plugin to the PLUGINS list in pelicanconf.py. Then you need to copy-paste some Jinja/HTML into the article.html template. That’s enough to make it work. I noted though that when I shared to Twitter, the text to be tweeted was encapsulated in quotes and there was a ‘b’ at the front. I realised this was due to using Python 3.x when the plugin (which hadn’t been updated for 2 years) was likely written for Python 2.x. A quick google and the obligatory trip to SO showed me how to convert a bytes string to a normal text string:
# Python 2
tweet = ('%s %s' % (title, url)).encode('utf-8')
# Python 3
tweet_as_bytes = ('%s %s' % (title, url)).encode('utf-8')
tweet = tweet_as_bytes.decode('utf-8')
I also found that an article couldn’t be shared to Twitter from a mobile device, and this was due to the URL being incorrectly formatted. The new URL format required separate arguments for the URL, the additional text and the Twitter handle:
# Incorrect
twitter_link = 'https://twitter.com/home?status=%s' % tweet
# Correct
twitter_link = 'https://twitter.com/intent/tweet?url=%s&text=%s&via=%s' % (url, tweet, t_handle)
Using meta-data to specify tweet text I thought it would be cool to add some default text to a tweet, as I’ve enjoyed this feature on other blogs when I’ve found a post I wanted to share on Twitter. A user may know they want to share an article but if they’re in a hurry it might be hard to find the right words, so why not provide a ready-made tweet. The text is editable so it’s only a suggestion. The text would be different for each post so it makes sense to specify it when writing the article. The article ‘summary’ would be too long, and I know that Pelican supports arbitrary meta-data tags.
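A more robust way to build a share link than manual string formatting is to let the standard library do the percent-escaping (a sketch; the intent URL and its url/text/via parameters follow Twitter’s web-intent convention, and the example values are hypothetical):

```python
from urllib.parse import urlencode

def twitter_share_link(url: str, text: str, handle: str) -> str:
    # urlencode percent-escapes spaces, quotes, slashes, etc.,
    # so the resulting link also works on mobile clients.
    params = urlencode({"url": url, "text": text, "via": handle})
    return "https://twitter.com/intent/tweet?" + params

link = twitter_share_link("https://example.com/post.html", "A post title", "somehandle")
```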
I assumed that Jinja would pick up the data the same way it picks up the 'standard' meta-data, and added a function:

def get_tweet(article):
    tweet = ''
    if hasattr(article, 'tweet'):
        tweet = article.tweet
        return quote(('%s ' % tweet).encode('utf-8'))
    else:
        return ' '

Once this function was working it was simply a case of calling the function and assigning the output to a variable I called "Tweet", and then adding "Tweet" to the text string to be included in a tweet's text:

tweet = get_tweet(article)
tweet_as_bytes = ('%s' % tweet).encode('utf-8')
tweet = tweet_as_bytes.decode('utf-8')
t_handle = ...  # the site's Twitter handle

There was a bit of fiddling around to make sure that the number of spaces between each part of the tweet was correct, but nothing as complicated as when making Time Until.

Specifying the text and image in a Facebook share

Sharing to Facebook worked without any formatting problems, but it bugged me that the opening text of the article was being used in the preview that was shared to Facebook, when I had a summary already prepared that would be much more useful to potential readers. For some articles I also had an article image that I wanted to see being used.

Googling revealed that I needed to use particular meta tags in the webpage's header if I wanted to control what Facebook would pick up. Facebook uses the "Open Graph" standard, so I would need the headers in my article pages to include tags like the following:

<meta property="og:title" content="..." />
<meta property="og:description" content="..." />
<meta property="og:image" content="..." />
<meta property="og:url" content="..." />
<meta property="og:type" content="article" />

I could see that I already had some meta tags being generated using the Jinja templates, so I set about copy-pasting and modifying them to build the new tags. I had some issues with trailing white space or line breaks being included within the content string. This was solved like so:

{# Adding '-' after and before the %'s strips white space and line breaks #}
{% block description %}
{%- if article.summary -%}
{{ article.summary }}
{%- else -%}
{{ SITENAME }}
{%- endif -%}
{% endblock description %}

I also needed to use some blocks more than once, because a description tag was already included but Facebook wants an og:description and Twitter wants a twitter:description too.
All three of these tags will include the same text (generated in the Jinja2 snippet above). If a block only needs to be used once then it's generated like this:

{% block description %}{% endblock description %}

But if you call "{% block description %}{% endblock description %}" again, Jinja will throw an error. The documentation (and SO) reveal that the solution is to use:

{{ self.description() }}

This allows you to reuse blocks multiple times and keep your templates DRY.

Finally, when I was testing Facebook to see if the correct text or image was being picked up, I was initially frustrated to see that the new tags were not having any effect. This is because Facebook crawls your site and saves what it finds. If you want it to take a fresh look at your page with its new meta tags, you need to tell Facebook to crawl the page again, using the Facebook debugger tool.

You can see the new sharing buttons below; please click them and see what happens.

Note: My first article describing how I began to use Pelican is here.

Blockchains from the ground up: Part 1
(Technical/Cryptocurrencies, 25 May 2017. Tags: blockchains, digital currencies, distributed ledger technology, public key cryptography)

How to maintain a reliable list across a small network without a central authority.

This is part 1 of an introduction to the key features of a generalised blockchain. I haven't included references to Bitcoin or any particular digital currencies or blockchains. This is because a digital currency is just one application of a blockchain.

Create a financial document that cannot be forged or disputed

Let's imagine there is a village somewhere where people still trade by bartering. John has some apples whilst Lizzie has some oranges. John would like an orange, and offers Lizzie an apple in exchange. She accepts, and writes John a receipt.

Date: 1234
From: Lizzie
To: John
What: 1 Orange
Price: 1 Apple

So far, so good.
The receipt is evidence of the transaction. The next day John wants an orange but doesn't have anything to exchange. He offers to write Lizzie a note saying he owes Lizzie 1 orange (an IOU). They think about this and agree that John should sign the note so that Lizzie can prove that John owes her 1 orange.

Date: 1234
From: John
To: Lizzie
What: 1 Orange
Signed: John's signature, Lizzie's signature

This IOU is a nice gesture, but it's simple to forge. Lizzie has the only copy of the IOU, and once Lizzie has seen John's signature she could easily copy it and create more IOUs. She could also change this IOU from 1 orange to 11 oranges (for example) and John couldn't prove what the original amount was. If Lizzie and John disagreed over what was owed, it would be impossible to know who was telling the truth. It's one person's word against the other's.

Lizzie realises this and suggests an improvement - they will find a witness and make 3 copies of the IOU. Each copy will be signed by Lizzie, John and the witness. Let's call this a 3-party transaction:

Date: 1234
From: John
To: Lizzie
What: 1 Orange
Witness: Walter
Signed: "John's signature" "Lizzie's signature" "Walter's signature"

This is a much stronger document and is more difficult to forge. If Lizzie changes the "What:" to "11 Oranges", both John and Walter will have copies of the original with her signature on it. It will be 2 pieces of evidence against Lizzie's 1. Lizzie will be laughed out of court. Haha.

3-party transactions work pretty well, and this is how most transactions are recorded today. But there is a weakness: if Lizzie can bribe Walter then the transaction can be falsified! John would rely on Walter to verify his version of the transaction, but would be let down by Walter's lack of integrity.
Lizzie and Walter could change 1 orange to 11 oranges, and if Lizzie offered Walter some of the extra oranges this would give them both an incentive to forge the documentation. If Walter liked oranges enough, he might not care that his career as a witness would be ruined.

This is a problem for modern financial systems, and a great deal of time, money and regulation is devoted to trying to ensure that third parties are trustworthy. E.g. if I buy a car and my bank is in cahoots with the car dealership, I could be defrauded. Reducing this risk to an acceptably low level makes financial services slower and more expensive than they would otherwise need to be.

The solution is public-key infrastructure (which is introduced in my previous post). In this system, each individual generates their own public-private key pair. They keep their private key private and make their public key freely available. A detailed description of public-key cryptography is out of scope for this post, but briefly:

A public key is derived from a private key, and this pair together have a set of unique mathematical properties. Either key can be used to encrypt a message, but only the other key can be used to decrypt it. You cannot use the same key to encrypt and decrypt a message.

If the private key is used to encrypt, then anybody can decrypt (because the public key is publicly available), and whilst this is clearly a terrible way to keep a secret, it's a great way to verify who encrypted the message, because only one person has the private key. Because of this, using a private key to encrypt a message is effectively creating a digital signature which cannot be forged. (If the public key is used to encrypt a message then only the private key can be used to decrypt it, and this approach is used to transfer secret messages.)

Back to the fruit. If Lizzie wants to accept John's IOU she can use public-key cryptography, and no-one needs to worry about Walter.
There are 3 steps to this:

1] Create the IOU stating that John owes Lizzie 1 orange.

Date: 1234
From: John
To: Lizzie
What: 1 Orange

2] John creates a public-private key pair and encrypts the IOU using his private key. He adds an unencrypted "From" line:

From: John
Date: 1234, To: Lizzie, What: 1 Orange <- Signed and encrypted by John using his private key

3] John makes his public key freely available to anyone who wants it.

This will work because anybody (not just Lizzie) can check that John signed the IOU. The transaction can be verified by looking at the "From" part of that transaction, noticing that this transaction is supposedly from John, and then using John's public key to decrypt the encoded part. The signature can only be decrypted using John's public key if his private key was used to encrypt it. Because John is the only person with his private key, that proves the transaction is valid, and Lizzie isn't dishonestly creating a debt for John to pay.

Clearly if John discloses his private key (or it's stolen) then he will make the system insecure, but this is a problem with John and his security protocols, not with the system itself.

Create and maintain an accurate list

So far we've seen how 1 IOU (for an orange) can be securely created, signed and verified. This process can be extended to be used by more people to exchange more fruit.
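The three steps above can be sketched in code. This is a toy illustration only: the numbers form a tiny textbook-RSA key pair (hypothetical, and far too small for real use); the point is simply "sign with the private key, verify with the public key":

```python
# Toy sketch of steps 1-3: John signs the IOU with his private key and
# anyone can verify it with his public key. NOT a real signature scheme.
import hashlib

n, e, d = 3233, 17, 2753   # John's toy keys: (n, e) is public, d is private

def digest(text):
    # Reduce the IOU text to a number smaller than n so the toy key can sign it
    return int.from_bytes(hashlib.sha256(text.encode()).digest(), "big") % n

def sign(text, private_d):
    return pow(digest(text), private_d, n)       # "encrypt" with the private key

def verify(text, signature, public_e):
    return pow(signature, public_e, n) == digest(text)  # "decrypt" with the public key

iou = "Date: 1234 From: John To: Lizzie What: 1 Orange"
signature = sign(iou, d)

print(verify(iou, signature, e))  # True: anyone holding John's public key can check this
```

Anyone holding only the public pair (n, e) can run `verify`, which is exactly why Walter the witness is no longer needed.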
For example, the original note:

From: John
Date: 1234, To: Lizzie, What: 1 Orange <- Signed and encrypted by John using his private key

Then some further transactions:

From: Lizzie // Date: 1235, To: John, What: 2 Apples <- Signed and encrypted by Lizzie using her private key
From: John // Date: 1236, To: Chris, What: 1 Banana <- Signed and encrypted by John using his private key
From: Chris // Date: 1237, To: Lizzie, What: 2 Bananas <- Signed and encrypted by Chris using his private key

After these 4 transactions between John, Chris and Lizzie:

John owes 1 orange to Lizzie and 1 banana to Chris.
Lizzie owes 2 apples to John.
Chris owes 2 bananas to Lizzie.

This is confusing (and ridiculous). It is not possible to know who is the most in debt or who is the most wealthy. Lizzie owes 2 apples, but is owed 1 orange and 2 bananas. Does that mean her fruit business is losing money or making money? We cannot say. To be able to know, we need to use the same unit of value for all the fruits.

Let's say that an orange is worth 2 apples, and a banana is also worth 2 apples (therefore 1 banana = 1 orange). Also, let's invent a currency called "coins" and say 1 apple is worth 1 coin. The 4 transactions can now be rewritten:

From: John // Date: 1234, To: Lizzie, What: 2 coins <- Signed and encrypted by John using his private key
From: Lizzie // Date: 1235, To: John, What: 2 coins <- Signed and encrypted by Lizzie using her private key
From: John // Date: 1236, To: Chris, What: 2 coins <- Signed and encrypted by John using his private key
From: Chris // Date: 1237, To: Lizzie, What: 4 coins <- Signed and encrypted by Chris using his private key

By going through the list of transactions we can see that:

John owes Lizzie and Chris 2 coins each, and is owed 2 coins by Lizzie. His net amount is -2.
Lizzie owes John 2 coins, but is owed 2 coins by John and 4 coins by Chris. Her net amount is +4.
Chris owes Lizzie 4 coins but is owed 2 coins from John.
His net amount is -2.

So far Lizzie is the only person who appears to have any wealth. What if Lizzie wanted to use the 4 coins that she is owed by Chris to buy something from John? Could she use this system to transfer Chris' promise to pay her 4 coins, so that Chris would pay John instead? The fact that everyone can be sure that the record of the transactions is accurate and authentic allows a debt to be used as payment. Lizzie's transaction would look like this:

From: Lizzie // Date: 1238, To: John, What: ba781... <- Signed and encrypted by Lizzie using her private key

The "What" section contains a hash of the original transaction (with Chris) that she wants to transfer to John. A hash is a signature for a file or some text, and in this case it is the signature for Lizzie's transaction with Chris. The hash is unique to each transaction and identifies which transaction is being used as payment. Because both transactions are signed using Lizzie's private key, it is simple to verify that Lizzie has the right to use this previous transaction, where she is owed (or paid) some coins, as payment to someone else.

This shows how public-private key infrastructure can be used to securely record transactions and enable trade between a group of people, under certain conditions.

Blockchains can be used to transfer units of value like in this example, but we could just as easily put selfies or certificates of ownership (for houses, financial instruments, diamonds, etc.) inside the "What" part of the transaction. To do this we make two other changes: removing the "To" part of the transaction, and including a hash of the transaction as part of the text which is signed using a private key. If we do this, then a record would be:

From: Chris // Date: 2345, What: "A photo of me" <- Signed and encrypted by Chris using his private key

This would create a reliable record of what Chris claims he looks like.
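The "What: ba781..." reference above is just a hash of the earlier transaction's text. A minimal sketch (the record string and the truncated identifier are illustrative, and SHA-256 stands in for whichever hash function the ledger uses):

```python
# Sketch of the hash reference: the identifier in the "What" field is a
# hash of the earlier transaction's text.
import hashlib

chris_tx = "From: Chris // Date: 1237, To: Lizzie, What: 4 coins"
tx_hash = hashlib.sha256(chris_tx.encode()).hexdigest()
print(tx_hash[:5] + "...")  # a short identifier, like the "ba781..." above

# Changing the record text in any way yields a completely different hash,
# so the reference pins down exactly one transaction.
tampered = "From: Chris // Date: 1237, To: Lizzie, What: 44 coins"
print(hashlib.sha256(tampered.encode()).hexdigest() == tx_hash)  # False
```
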
He can confidently send anyone this record, and if they have his public key then they can verify that it is Chris himself who signed it and is asserting that the photo is him. If somebody changed the photo, then the data in the transaction would change and the transaction would have a new hash value. The new hash value will not match the hash value contained within the signature, and the text in the signature cannot be changed because it can only be encrypted using Chris' private key, which only Chris has. Therefore it will be simple to show that someone other than Chris is trying to change the photo.

Another use for public-key cryptography arises if Chris were an employee in a bank, and the "What" contained documents about a customer the bank is providing financial services for. In this scenario, Chris (representing the bank) is effectively confirming the customer's true identity and documenting the evidence that's been collected to show that the bank knows who their customer really is. If the transaction included a new section called "Customer ID" (for example), then a database of all customers whose identity checks have been successfully completed can be made. This can be shared with other departments (or banks) easily and immutably. This is the concept behind KYC on a blockchain.

Back to our fruit traders: at the moment a participant is allowed to carry a net negative balance. For this system to work in reality, the creation of "coins" will need to be controlled in order to maintain their value. In the example above, people can freely create "coins" and can also carry negative amounts of "coins". This would result in the value of a "coin" plummeting. Therefore their creation (and conversion from fruit) must be controlled in a sensible way.

Our examples so far only include 3 people.
If there are a lot of people in the network it wouldn't be feasible to insist that everyone is present or online each time a new transaction is added to the list (the chain) of transactions. However, if we allow transactions to be added whilst some people are offline, we create an opportunity for fraud. The reasons why, and the solution to this and other problems, will be described in part 2.

London Rent vs. London Salaries
(Non-technical/Journal, 12 May 2017. Tags: london, rent)

Living in London is expensive; everyone in the UK knows this. Everyone knows that this is mostly due to the price of property, which we all enjoy talking about. I've lived here for 3 years now, and when my wife and I were both working we lived comfortably. Two incomes under one roof is just fine. Not as fine as in many other cities, but reasonable.

Just over a year ago my first child was born, and now that my wife's maternity pay has ended we are a single income household. This is not fine. Even though I earn almost £50k/year, and any salary increase will be taxed at 40%, I cannot cover essential day-to-day costs. I'm in the 40% tax bracket, and my basic monthly outgoings exceed my monthly take-home pay by £400. If my employer wanted to pay me enough to cover essential costs, it would cost them approximately double the shortfall due to tax. None of this makes sense. Salaries and living expenses have become decoupled.

For most people the most obvious way to increase wealth is to buy property. Mortgage repayments are cheaper than rent, and the value of the property increases over time. Double win. But there is a big hurdle to overcome before this is possible: saving money for a deposit on a house is often impossible without external help (i.e. the "Bank of Mum and Dad"), because so much of the money earnt must be spent on rent first. It's a trap.
Repayments on a mortgage are cheaper than paying rent, but you cannot save enough for a deposit because so much of your salary goes to paying rent[1], and the average cost of a flat in London is £500k[2], which means a 5% deposit is £25k — a higher deposit than you would need anywhere else in the UK, despite London being the place you are least able to accumulate savings.

What if my partner went back to work and we put our baby in childcare? Childcare is not cheap, and working would need to bring in more than the cost of childcare. This requires a higher than average salary from a graduate job (which is the life stage when people might reasonably start having children), so unless both parents are unusually high earners, that option isn't viable either. Living in London is only financially possible if you are either single or your household income is more than average.

[1] You'd be lucky to find a 1 bedroom flat for less than £1000/month.
[2] According to Rightmove, an estate agent with some useful price statistics.

Introduction to the Æternity blockchain project
(Technical/Cryptocurrencies, 5 May 2017. Tags: æternity, dlt, digital currencies, finance, blockchains)

These are my notes on the æternity blockchain project; I'm not affiliated with the æternity team.

Æternity is a new blockchain project that is pre-launch. The headline goal is to securely facilitate large volumes of smart contracts which interface with external data sources. This is made possible via a decentralised oracle based on prediction markets. These terms are explained below.
The \u00e6ternity project has proposed several notable Smart Contracts Oracles and native Governance by Written in\u00a0Erlang Different types of\u00a0node Sharding Smart Contracts A smart contract is a way to execute a contract without an intermedia (middle-ma and The smart contract is a protocol which is stored and executed on a blockchain executing transactio (outputs) based on specific inputs and programmab logic automatica The logic often mirrors that contained in clauses of a State channels are payment networks that exchange funds off-chain and periodical settle up accounts with the main blockchain (The Bitcoin Lightning Network is creating a system for routing Bitcoin payments through State channels increase scalabilit by making groups of transactio independen of each other. This allows them to be processed in\u00a0paralle \u00e6ternity proposes executing in state channels (Turing complete means, colloquial real-world and general purpose), which should allow greater volumes of transactio and make the smart contracts more secure and easier to\u00a0analyse This is because executing the off-chain makes them private and the code used to execute the smart contract won\u2019t need to be broadcast to the primary blockchain This should increase processing capacity by allowing contracts to execute in\u00a0paralle Disadvanta to the state-chan approach include reduced transparen as running smart contracts in state channels requires more trust in both the contract creator and the node running\u00a0it Oracles and The Oracle functional allows to interact with data outside the \u00e6ternity blockchain This is possible by checking on-chain prediction market results and rewarding users who made the correct prediction Users are rewarded through automated payments and the immediate recording of transactio in the blockchain This creates incentives to participat in prediction markets, which have been shown to be\u00a0effecti On-chain, rather than off-chain allows greater 
efficiency. The prediction market is expected to run using a native (on-chain) consensus procedure. The oracle mechanism is designed to use the same procedure.

Governance by prediction markets

The oracle functionality complements prediction markets. Prediction markets are proposed to implement governance of the æternity blockchain. This is a new approach: the æternity protocol would be governed by user input. A prediction market will exist where changes to features and protocols would result in a higher token value. The incentive to increase the value of a token (Æon) would allow the æternity community to decide efficiently which changes to implement. Low-level protocol changes to variables like block times and block capacity could be possible. The consensus developed by the prediction market will initially provide input to the development team. Later, a fully autonomous prediction market for governance is expected (a DAO).

Written in Erlang

Erlang is normally used for large-scale systems that manage the allocation of scarce network resources (telecoms, banking, etc.). It could make it easier to run a lightning network and process many state channels in parallel. As far as I know, Æternity is the first blockchain project to be written in Erlang.

Different types of node

The æternity network will contain nodes with different functions. Each type of node will contribute towards the efficient functioning of particular aspects of the network. Node types will include:

Liquidity - lots of channels and lots of users. Open a channel to issue a contract, for a fee.
Exchanges - trustless exchanges of assets offered by market makers. Profitable to market makers because of the fees they can charge.

Presumably features such as consensus algorithms and prediction markets will also require their own dedicated node types. Users of the node will incur transaction fees to participate, providing an incentive to run a node.
Sharding

Sharding allows a greater transaction volume, removing the scalability problems that Bitcoin and Ethereum face. Sharding works by splitting the space of possible accounts into subspaces (for example, based on the first digit of an account's address):

Each shard gets a set of validators.
Each validator validates 1 shard only.
Contracts and transactions within the same shard work as normal.
Contracts and transactions across shards require alternative techniques.

The Æternity ICO: My experience
(Technical/Cryptocurrencies, 30 April 2017. Tags: fintech, finance, digital currencies, blockchains, dlt, æternity, ico)

On April 3rd, I happened to be Googling digital currencies and blockchain innovations when I came across the Æternity website and skimmed their white paper. The project is ambitious, like many crypto projects, but seems well organised. The team is well known in the space. There is a clear plan to develop the project and create a blockchain technology that, if successful, could bring a step change in the use of digital currencies for high volume, low value transactions, and the viable implementation of smart contracts.

The ICO

To my surprise, I realised that phase 1 of the Initial Coin Offering (ICO) was about to begin, and if I wanted I could acquire the rights to Æons (the æternity token). During phase 1, 1 ETH would purchase 1100 Æons. In early April 2017, 1 ETH was worth about £38. I was willing to make a small and risky investment, but in order to do that I would need to work out how to convert my conventional Sterling into Bitcoin or Ether, in order to then purchase Æons.
The \u00c6ternity website made it super easy to set up an Etherium wallet, and to use that wallet to invest in the \u00c6ternity project, but buying Ether immediatel and putting it in my new wallet proved to Helpfully, the \u00c6ternity project had partnered with the Swiss firm \u2018Bitcoin Suisse AG\u2019 who would directly convert to \u00c6ons from fiat currencies cutting out the need to purchase an intermedia digital currency. However once I\u2019d completed the identity checks and signed, scanned and sent the multiple forms, I realised I\u2019d need to pay a \u2018signing on\u2019 fee of about \u00a370. To Bitcoin Suisse\u2019 credit though, they did manually approve my identifica and contract within an\u00a0hour. At the time, I thought the project would still be a good investment even with this extra cost, but I determined to exhaust all other possibilit first. I\u2019m familiar with cryptocurr wallets and public/pri keys due to some previous research, so I was able to immediatel begin trying to set up an account with an\u00a0exchang Exchanges What followed was a fairly chaotic few hours where I would sign-up to several exchanges and see how close I could get to purchasing either Bitcoin or Ether immediatel before realising I either had to wait 48 hours for security clearances or provide additional details, or wait for manual verificati of my scanned By the end of the evening I had a rough idea of which usernames, passwords and (small) sterling amounts had been submitted to each\u00a0excha After a couple of false starts, I used a combinatio of the Coinbase desktop website and their iOS app to purchase ETH up to their weekly limit, and then used multiple cards to increase my holding of ETH. The Coinbase app would bug out when processing a debit card payment and verifying the card details with the bank. This had initially led me to try other exchanges, where I would hit other roadblocks and delays. 
CEX.IO, for example, doesn't allow you to make trades in the first 48 hours (IIRC) after registering, which is reasonable enough unless you're in a hurry.

Conclusion

Once I had a few Ether to my name, the rest of the process was simple. I was delighted to visit Etherscan.io and view the details of my Ethereum-to-Æternity transaction almost immediately. This gave me a lot of confidence that I hadn't sent money into a void, and was a nice contrast to my (successful) experience buying bitcoin in early 2014. Finally, the simple tool to check your Æ balance at the bottom of æternity's contribution page assured me that I'd made my investment successfully.

To me, the Æternity project stands as an exciting endeavour seeking to solve some widespread and highly valuable technical challenges. I hope they're successful and wish them well.

Blogging with Pelican
(Technical/Web, 28 April 2017. Tags: pelican, blog, python, static site)

When I began blogging in 2016, I became more aware of how blogs are designed. Many of my favourite blogs had simple designs which made it easier to focus on the content, and they loaded really fast (e.g. CuriousGnu). I wanted this for my blog, too. I'd used Wordpress to build and publish my blog, which was a great way to begin, but I felt I was compromising on its design and functionality. I wanted to have control over my site.

This led me to static sites, which contain only fixed content and are faster to load and easier to design than one built using a dynamic blogging platform such as Wordpress. Because I was already familiar with Python, I chose Pelican rather than another static site generator such as Jekyll. There are plenty of sites to tell you how to start blogging in Pelican, so here I will focus on my experience after the initial set-up. When I was learning how to begin, I found Amy Hanlon's blog particularly useful and clear.
The learning curve

… was longer than I expected. Since setting out to switch from Wordpress to Pelican, I've taught myself enough of the following tools to hack this site together. I'm really happy about this, because these tools could be used in future projects too.

HTML

I find HTML quirky but intuitive. Tags make sense, comments are laborious, and learning by google is relatively quick. Whatever it is I'm trying to do (like add a link to jump back to the top of the page), someone will have posted the answer.

CSS

Writing CSS feels a lot more concise than HTML, but it also felt impossible to learn without taking a step back and reading an introductory course. Usually I learn by hacking new phrases together from existing examples, so it was frustrating to go backwards before progressing. There was a lightbulb moment when I realised CSS selectors were a thing, and realising CSS files get called in the header (usually) of an HTML file…

I ended up using a trial subscription to Thinkful's front-end developer course, which is pretty good at explaining how CSS is structured and how to arrange content on a page. If I still had access, I'd be completing the second half of the course :)

Jinja

Jinja is a tool written in Python to create HTML pages. It doesn't look intuitive to me, but I've been able to get enough done by copy-pasting similar snippets from other parts of the theme I started with (thanks Molivier!) to make the changes I wanted. I'd like to learn more, as it seems useful.

Pelican

To build a website using Pelican, you need to run commands from the terminal. There are various commands, but I found myself using only a few regularly. "pelican-quickstart" will generate a project skeleton to get you started. "make devserver" will initialise a local server and generate output files so that I can view changes locally before uploading (its opposite is "make stopserver"). Finally, "pelican content -s publishconf.py" generates the html and css for remote hosting.
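For reference, the handful of commands above look like this in a terminal session (a sketch; the Makefile targets come from the skeleton that pelican-quickstart generates):

```shell
pelican-quickstart                 # answer the prompts to generate a project skeleton
make devserver                     # build the site and serve it locally for preview
make stopserver                    # stop the local preview server
pelican content -s publishconf.py  # generate the final html/css for remote hosting
```
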
Some of the plugins I use, such as "Assets" which minifies the CSS, only work when publishconf.py is called, which confused me initially as I didn't think the plugin was working when I was only using the devserver.

Git

This really challenged me, and I still don't feel like I know what I'm doing. Git is far more powerful than I need it to be, when all I want to do is undo some erroneous edits and upload a new version of the site to Github. I can stage and commit files, and I can create local and remote repos from the command line. I can change a remote's URL, reset a repo and force a push or a pull. That's all. I haven't tried to merge or to create a test branch, and if some part of the process goes wrong, it usually takes hours to make it right again. This is one tool for which the awesome SO and Google cannot magic up the exact right answer.

For example, there is still an output folder in the source repo that is… mysterious to me. It's not the real output, it's a version frozen in time from a few weeks ago, and it has an "@" in its name. I don't know how it got there. It was created one afternoon in a blur of frustrated google queries. I find git's commands the least intuitive of all the tools I use, with its preceding single dashes and double dashes, and random words thrown into the middle of otherwise sensible commands. But Git is ubiquitous and Github is awesome, so I will learn it.

Github pages with an external URL

You'll need to add a file called CNAME to the repo containing the output. CNAME should contain the address of your site in plain text. You'll also need to update the DNS records of your domain name to point the name at Github's servers.
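The CNAME file mentioned above is just one line; a sketch with an illustrative domain:

```text
www.example.com
```

Github Pages then serves the site at that domain once the DNS records for the domain point at Github's servers.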
For Github, you need two "A Records" with host type "@" and the two IP values Github specifies (one per record). You also need a CNAME record with host type "www" and the value equal to your username.github.io address. It took about 12 hours for the changes to propagate, and during that time I had variable success loading the site.

Plugins

One thing I didn't want when moving away from Wordpress was a site bloated with features that didn't make the content easier to read. However, I found I still needed a few plugins to optimise my site and provide some basic functionality that doesn't come with the default installation.

ipynb

Super useful, as all I need to do to publish a notebook as a webpage is copy the .ipynb file into the content directory and add a sidecar .ipynb-meta file with standard meta data. This functionality is one of the main reasons why Pelican is popular with data bloggers. (Though Nikola is also popular.)

Neighbors

At the end of a post there should be a link to the previous and next blog posts - I was surprised this wasn't included as standard. After putting the plugin in the plugins folder and updating pelicanconf.py, you need to copy a couple of jinja snippets into a template, and maybe add some css to make the links look nice.

Optimize Images

Make those images as small as possible to help make the site as fast as possible. Add the plugin, update pelicanconf.py, and that's all.

Assets

Before I started working with Pelican, minifying css and JavaScript would have been too advanced. But once Pingdom and Google Pagespeed started criticising me for my multiple .css files, I accepted the challenge.

Conclusion

I'm super happy with the website's design and speed. It's designed the way I want it, and I've learnt a ton of useful stuff along the way.
Update: My second post about blogging in Pelican is here."},{"title":"Analysing a personal library","category":"Technical/Data","url":"books.html","date":"1 February 2017","tags":"data analysis, python, isbn ","body":"A friend of mine has collected books for many years and has recently begun to catalogue them. In this post, I show some simple analysis of the catalogue and then query an ISBN database to fill in some missing data. To toggle the visibility of the code blocks, click here. Set-up and data: choose some settings and import some packages. In [2]: # Display plot results inline, not in a separate window %matplotlib inline %pylab inline # Set the size of all figures pylab.rcParams['figure.figsize'] = (14, 5) import pandas as pd import re import bibtexparser import numpy as np import matplotlib.pyplot as plt Populating the interactive namespace from numpy and matplotlib Load the catalogue: In [3]: df = table[0:9187] orig_rows = df.shape[0] print("There are %d rows in the catalogue" % df.shape[0]) There are 9187 rows in the catalogue Data formatting and tidying: View the top 5 rows to see how the data is arranged and how many cells are complete. In [4]: df.head() Out[4]: Location Subject Title Author Publisher ISBN? 
Shelf Pages Price Value Date 0 HR Islam The Islamic Invasion R Morey Harvest HP 1960 0 89081 983 1 17cm 221 3 8 2008-04-01 00:00:00 1 HR Word lists New Testament Word Lists Morrison & Barnes Erdmans 1975 0 8028 1141 8 NaN 125 3 NaN 2008-04-01 00:00:00 2 HR Theology: Salvation The Triumph of the crucified E Sauer Paternoste 1952 NaN NaN 207 3 10 2008-04-01 00:00:00 3 HR Early Fathers Ante-Nicen Christian Library Ed Menzies T&T Clark 1897 NaN NaN 533 9 NaN 2010-03-01 00:00:00 4 HR Apologetic Earth\u2019s earliest ages G.H. Pember H & S 1895 NaN NaN 494 3 NaN 2008-04-01 00:00:00 Set float format to two decimal places (currency) Not all rows can become a\u00a0float: In\u00a0[5]: = def to_number( try: s1 = return s1 except ValueError return s df.Price = f : to_number( df.Value = f : to_number( Find and remove blank\u00a0rows In\u00a0[6]: # How many rows are all NaN values df = # drop a row only if ALL columns are NaN print('%d row removed ' % (orig_rows - df.shape[0 ) # 1 row contained all NaN and has been removed 1 row removed List the number of rows in each column which are\u00a0empty: In\u00a0[7]: # How many rows in each column are NaN Out[7]: Location 29 Title 34 Publisher 175 Shelf 336 Pages 540 Author 915 Price 3611 ISBN? 4770 Date 5712 Subject 6208 Value 9179 dtype: int64 Based on these results, title and publisher are the most Split a column containing two types of\u00a0data: The \u201cPublisher column contains both the publisher and the year it was published. This should be split into two\u00a0column In\u00a0[8]: = None # default='w df['PubYea = expand=Tru # regex is confusing = expand=Tru Improve the format of the \u2018Date\u2019\u00a0col In\u00a0[9]: df['Date'] = The data frame is now in the columns I want it to be in, and the top 5 rows\u00a0are: In\u00a0[10]: df.head() Out[10]: Location Subject Title Author Publisher ISBN? 
Shelf Pages Price Value Date PubYear 0 HR Islam The Islamic Invasion R Morey Harvest HP 0 89081 983 1 17cm 221 3.00 8.00 2008-04-01 1960 1 HR Word lists New Testament Word Lists Morrison & Barnes Erdmans 0 8028 1141 8 NaN 125 3.00 NaN 2008-04-01 1975 2 HR Theology: Salvation The Triumph of the crucified E Sauer Paternoste NaN NaN 207 3.00 10.00 2008-04-01 1952 3 HR Early Fathers Ante-Nicen Christian Library Ed Menzies T&T Clark NaN NaN 533 9.00 NaN 2010-03-01 1897 4 HR Apologetic Earth\u2019s earliest ages G.H. Pember H & S NaN NaN 494 3.00 NaN 2008-04-01 1895 of books by The bar chart below shows how many books in the library were published in a given decade. The list below shows the 5 oldest\u00a0boo In\u00a0[11]: = '{:,}'.for df.PubYear = fig = // 10 * logy = False) of Publicatio of Books\") fig Out[11]: at 0x10501f2b View the 5 oldest\u00a0tit In\u00a0[12]: df['PubYea = df2 = != 0.0] Out[12]: Location Subject Title Author Publisher ISBN? Shelf Pages Price Value Date PubYear 4753 Lib NaN In Christ\u2019s own country Dom Ernest Graf Burns Oates NaN Sh.4.5 302 Gift NaN 1999-07-30 1037 3043 StM NaN The First Epistle of Peter C.E.B. 
Cranfield SCM Press interestin Sh.5.5 128 NaN NaN NaT 1050 4574 Lib Music Score Easy-Play Speed Music; waltz clas NaN Sight & Sound NaN Sh.4.4 47 99P NaN NaT 1076 7184 25A NaN The Noble Qur\u2019an transl Al-Hilali & Khan Madinah NaN Sh.3.6 956 3.0 NaN 2010-12-10 1417 3296 Lib NaN Chained Bible NaN Chris Barker Very incomplete Sh.1.1 NaN NaN NaN NaT 1585 List the number of books in each\u00a0locat In\u00a0[13]: df3 = df = Out[13]: Location 25A 3411 LIB 2457 CH 1088 ST9 1000 STM 886 HR 305 NAN 29 ST.M 3 HR 2 ST M 1 SA 1 LB 1 CH 1 :LIB 1 dtype: int64 Create a list of the differnet subjects, order the list by the most In\u00a0[14]: df4 = df df4[\"Subje = Out[14]: Subject NAN 6208 COMMENTARY 61 LOCAL HISTORY 58 SERMONS 41 THE CENTURY BIBLE 37 THEOLOGY 36 CHRISTIAN BIOGRAPHY 34 BIOGRAPHY 31 NT COMMENTARY 31 HEBREW GRAMMAR 30 SACRED BOOKS OF THE EAST 30 POETRY 29 CLARK'S FOREIGN THEOL LIB 28 GENERAL EDIT ANTONIA FRAZER 25 NOVEL 24 WRITERS AND THEIR WORK 24 CATALOGUE 22 AUTOBIOGRA 20 OT COMMENTARY 19 GREEK 19 THE EXPOSITOR' BIBLE 18 SRIMAD BHAGAVATAM 18 THE BABYLONIAN TALMUD 18 PHOTOGRAPH 17 GREAT MUSEUMS OF T WORLD 15 CHURCH HISTORY 14 DAILY READINGS 14 CHRISTIAN LECTURES 14 NOTES ON THE CATHEDRALS 14 INTERNATIO CRITICAL COMM 13 ARAMAIC 13 THE CLARENDON BIBLE 13 FICTION - CADFAEL 13 CLARK'S FOREIGN THEOL LIB. 13 APOLOGETIC 12 PSALMS 12 THE CAMBRIDGE BIBLE 12 PRAYER 12 LIFE LIBRARY OF PHOTOGRAPH 11 COMMENTARY ON HOLY SCRIPT 11 THE MASTERPIEC LIBRARY 11 COMMENTARY ON THE O.T. 10 HYMNS 10 POEMS 10 MYSTICISM 10 DICTIONARY OF THE BIBLE 10 EXHIBITION CATALOGUE 10 POETICAL WORKS OF TENNYSON 10 DICTIONARY 10 HISTORY 10 dtype: int64 Create a list of authors in the library. Order the list by number of\u00a0books: In\u00a0[15]: df5 = df df5[\"Autho = Out[15]: Author NAN 915 VARIOUS 31 BHAKTIVEDA S PRAB 19 C.H. SPURGEON 19 ELLIS PETERS 17 ED. RABBI I. 
EPSTEIN 17 LESLIE WEATHERHEA 15 ED CARLO RAGGHIANTI 15 ALBERT BARNES 14 JAMES HASTINGS 13 GEORGE ADAM SMITH 13 WILLIAM TEMPLE 12 JAMES MOFFATT 11 ED J A HAMMERTON 11 KEIL & DELITZSCH 11 WILLIAM BARCLAY 10 SHAKESPEAR 10 PETER ACKROYD 10 IAN WILSON 9 CHARLES DICKENS 9 BERNHARD WEISS 9 J.B. PHILLIPS 9 ED QUENNELL & HODGE 8 CHARLES GORE 8 H.V. MORTON 8 M.F. SADLER 8 VARIOUS AUTHORS 8 ED ANDREW LANG 8 GEZA VERMES 8 S.R. DRIVER 8 EVELYN UNDERHILL 8 \" 8 HENRY ALFORD 7 ALDOUS HUXLEY 7 ED JAMES HASTINGS 7 ED ARTHUR MEE 7 ROY STRONG 7 ALEXANDER MACLAREN 7 MARCUS DODS 7 JOACHIM JEREMIAS 6 THOMAS WRIGHT 6 ED R. CROMARTY 6 SUSAN GLYN 6 DAVID FOUNTAIN 6 WILLIAM WHISTON 6 C.S. LEWIS 6 BARRIE TRINDER 6 DAVID TRUMPER 6 G. CAMPBELL MORGAN 6 ALISTER MCGRATH 6 dtype: int64 Distributi of book length by number of\u00a0pages: In\u00a0[16]: def try: s1 = return s1 except ValueError return '' df.Pages = f : df6 = != ''] 2000, 100.0)) fig = range=[0, 2000]) of Pages\") of Books\") Out[16]: at 0x10585b3c Query an ISBN database to find missing\u00a0da Lastly, I thought it would be a fun challenge to fill in gaps in the data. The table below shows rows with ISBN number but missing either Author, Title or\u00a0Publish It turns out that there are only 10 rows that meet this criteria, and in all cases it is the publisher that is\u00a0missing In\u00a0[17]: df7 = & ((df['Auth == '') | (df['Title == '') | == ''))] df7 Out[17]: Location Subject Title Author Publisher ISBN? 
Shelf Pages Price Value Date PubYear 1430 ST9 BIOGRAPHY OF SCIENTIST Longitude (John Harrison) DAVA SOBEL 1 85702 571 7 Sh.1.2 184.0 5.99 NaN NaT 1998 2707 STM DEVOTIONAL Romans: Momentous News DAVID COOK 978 1 906173241 Sh.3.4 55.0 1.0 NaN 2011-07-28 2011 3874 LIB NAN Annie\u2019s Box - Darwin\u2019s daughter RANDAL KEYNES 1 84115 060 6 Sh.2.4 331.0 3.99 NaN 2002-07-20 2001 4705 LIB HISTORICAL NOVEL Galileo\u2019s Daughter DAVA SOBEL 1 85702 861 9 Sh.4.5 429.0 NaN NaN 2002-09-27 1999 5949 25A NAN Short Life Long Times of Mrs Beeton KATHRYN HUGHES 1 84115 373 7 Sh.1.4 525.0 \u00a32.50P NaN 2012-02-09 2005 6008 25A NAN Signs in the Sky (Birth of a New Age ADRIAN GILBERT 0 609 80793 5 Sh.1.5 329.0 4.0 NaN 2012-03-07 2001 6097 25A NAN Isaac Newton, the last Sorcerer MICHAEL WHITE 1 85702 706 X Sh.1.6 403.0 \u00a31.50P NaN 2012-02-09 1997 6663 25A NAN Live Wires - powerful stories of cha D. J. CARSWELL 978 1 906173 13 5 Sh.2.7 124.0 1.0 NaN 2011-06-29 2010 8640 25A KING ARTHUR QUINCENTEN One in Specyal ED SIDNEY HART 0 948485 00 0 Sh.5.9 145.0 \u00a32.49P NaN 2003-06-28 1985 8845 25A NAN The Order of St John - a short history E L EDMONDS 0 947718 07 9 Sh.5.7b 35.0 \u00a33.75P NaN NaT 1986 In\u00a0[18]: from isbnlib import * from isbnlib.co import * from import * import bibtexpars def SERVICE = 'isbndb' APIKEY = 'IZXL3ESD' # YOUR key APIKEY) # register your key bibtex = isbn = clean(isbn try: a = SERVICE)) return a except: return 'isbn is invalid' In\u00a0[19]: def get_pub(is bibtex_str = has_isbn(i try: bib_db = dic = return except: return In\u00a0[20]: df7['ISBN? = df7.Publis = f : get_pub(f) The table below shows the results of the isbnlib query. I thought it odd that all the \u2018missing\u2019 publishers names began with a number. It turns out that the regex method I used to split publisher name and year of publicatio into separate columns doesnt work when there are numbers in the publishers name. 
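The pitfall described above can be avoided by anchoring the year to the end of the string, so digits inside a publisher's name are not captured as the year. A sketch with made-up rows (the column names mirror the catalogue; the regex is my suggestion, not the original script's):

```python
import pandas as pd

# Made-up rows shaped like the catalogue's combined "Publisher + year" field.
df = pd.DataFrame({"Publisher": ["Harvest HP 1960",
                                 "T&T Clark 1897",
                                 "10Publishing 2011"]})

# Anchor the 4-digit year to the END of the string, so digits inside a
# publisher name (e.g. "10Publishing") are not mistaken for the year.
extracted = df["Publisher"].str.extract(
    r"^(?P<Publisher>.*?)\s*(?P<PubYear>\d{4})$"
)
```

With this pattern "10Publishing 2011" splits into "10Publishing" and "2011", which is where the original split went wrong.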
Rather than go back and correct this, I\u2019ll leave the script as it is to show how to use the In\u00a0[21]: df7 Out[21]: Location Subject Title Author Publisher ISBN? Shelf Pages Price Value Date PubYear 1430 ST9 BIOGRAPHY OF SCIENTIST Longitude (John Harrison) DAVA SOBEL Fourth Estate 1 85702 571 7 Sh.1.2 184.0 5.99 NaN NaT 1998 2707 STM DEVOTIONAL Romans: Momentous News DAVID COOK 10Publishi 978 1 906173241 Sh.3.4 55.0 1.0 NaN 2011-07-28 2011 3874 LIB NAN Annie\u2019s Box - Darwin\u2019s daughter RANDAL KEYNES 4th Estate 1 84115 060 6 Sh.2.4 331.0 3.99 NaN 2002-07-20 2001 4705 LIB HISTORICAL NOVEL Galileo\u2019s Daughter DAVA SOBEL Fourth Estate 1 85702 861 9 Sh.4.5 429.0 NaN NaN 2002-09-27 1999 5949 25A NAN Short Life Long Times of Mrs Beeton KATHRYN HUGHES None 1 84115 373 7 Sh.1.4 525.0 \u00a32.50P NaN 2012-02-09 2005 6008 25A NAN Signs in the Sky (Birth of a New Age ADRIAN GILBERT Three Rivers Press 0 609 80793 5 Sh.1.5 329.0 4.0 NaN 2012-03-07 2001 6097 25A NAN Isaac Newton, the last Sorcerer MICHAEL WHITE Fourth Estate 1 85702 706 X Sh.1.6 403.0 \u00a31.50P NaN 2012-02-09 1997 6663 25A NAN Live Wires - powerful stories of cha D. J. CARSWELL None 978 1 906173 13 5 Sh.2.7 124.0 1.0 NaN 2011-06-29 2010 8640 25A KING ARTHUR QUINCENTEN One in Specyal ED SIDNEY HART Three Golden Crowns 0 948485 00 0 Sh.5.9 145.0 \u00a32.49P NaN 2003-06-28 1985 8845 25A NAN The Order of St John - a short history E L EDMONDS s.n 0 947718 07 9 Sh.5.7b 35.0 \u00a33.75P NaN NaT 1986 if { var mathjaxscr = = = = ? 
"},{"title":"FakeGL: A Synthetic General Ledger and Trial Balance","category":"Technical/Data","url":"fakegl.html","date":"6 January 2017","tags":"data analysis, finance, journals, reconciliation, accounting, trial balance, general ledger, python ","body":"My work involves processing a lot of General Ledgers and I wanted to build and test various automation and analytical techniques to see how my workflow could be improved. In order to do that in a free and fun way, I would need fake data, so I set out to build a process to generate a fake General Ledger (GL) and a corresponding Trial Balance (TB). Motivation and approach: I didn't know how comprehensive I needed the GL to be - modern systems are complex and store data for a wide variety of uses. I resolved to start with something simple and iterate for as long as I wanted. The journals produced below satisfy the following general constraints: Each journal contains equal debits and credits. Opening and closing Trial Balances net to 0. Profit and Loss (P&L) accounts start the year with 0 balance, Balance Sheet (BS) accounts do not. Each transaction hits both the P&L and the BS (i.e. 
If an account on the P&L is credited, then the other side of the transactio is a debit to the BS) Distinguis between manual and The GL: Contains journals posted evenly throughout the year (this isnt realistic, but is a simple way to generate date\u00a0data) Receives journals Identifies if a journal is manual depending on which subledger it Records which user posted the journal if the journal is\u00a0manual The script below allows the user to\u00a0specify The number of accounts on the GL/TB The number of journals in the GL A mean and variance for the number of lines in each\u00a0journ A mean and variance for the functional amounts posted to\u00a0account How many different users post The beginning of the financial\u00a0 The criteria for a manual journal, based on\u00a0subledg The proportion of manuals which are\u00a0manual The proportion of accounts which hit the P&L or BS An arbitrary list of\u00a0subledg The Jupyter Notebook below shows the annotated Python 3 code I\u00a0wrote: Notebook set-up\u00b6Loa the various libraries used to easily add the required features. Two libraries to\u00a0note: Pandas is pythons ubiquitous data handling\u00a0t Faker is a useful tool to generate fake data, and is an easy way to bootstrap a\u00a0database In\u00a0[1]: from random import gauss from faker import Factory import random import numpy as np import time from datetime import timedelta import datetime from natsort import natsorted, ns import pandas as pd Choose parameters and values for the GL and TB\u00b6 In\u00a0[2]: # ****** Number of different accounts in the GL ********* x = 111 # ****** Number of journals in the GL j = 15713 # Setup posting date d0 = '20160101' # first day, data generated over 1 year. 
d1 = \"%Y%m%d\") # ****** Distributi of lines per journals ********** jl_1 = 21 # mean jl_2 = 10 # variance j_l = lambda x, y: # ****** Number of different users posting journals ***** fake = U = 10 ul = [] for _ in range(0,U) # ****** Functional amount values q1 = 700 # mean q2 = 104 # variance def q(q1,q2): p = < 0.5 # True implies if p: i = -1 else: i = 1 out = i * return out # ****** Proportion of journals which are manual ******** Mp = 0.23 # ****** Proportion of accounts that are P&L accounts *** Pp = 0.3 # ****** Subledger names ********* source_fee = if an account feeds into the P&L or BS: In\u00a0[3]: def if len(elemen > 0: return element[2] == 'P' return False def if len(elemen > 0: return element[2] == 'B' return False Generate account\u00a0co In\u00a0[4]: def b_names = [] p_names = [] a_names = [] p = 'ACP' b = 'ACB' for i in range(x): A = < Pp if A: y = else: y = if len(b_name % 2 != 0: del b_names[-1 if len(p_name % 2 != 0: del p_names[-1 a_names = b_names + p_names Generate journal names and\u00a0length In\u00a0[5]: def d0 = '20160101' # first day, data generated over 1 year. d1 = \"%Y%m%d\") a_n = [] for i in range(j): n = y = 'J_' + n + d1 = d1 + a_n.append j_names = dict((el, int( j_l(jl_1,j / 2 )) for el in a_n) # determine how many lines are in each journal. 
return j_names Create the list of journal names and account codes\u00b6 In\u00a0[6]: j_names = a_names = Create the fake General Ledger and save it to a text file\u00b6 In\u00a0[7]: # Output format glf = f = open('gl.t 'w') f.write(gl + '\\n') for key in key=lambda y: y.lower()) line_no = -1 i = 0 # Assign each journal a source feed source_id = # Assign each journal a posting date posting_da = d1 = d1 + # Make journal either M or A, if M assign user t = < Mp # True implies U, 3*U/4) if t: man_ind = 'M' u_name = ul[int(p)] else: man_ind = 'A' u_name = '' # Assign functional amount to each line while i < j_names[ke i = i + 2 line_no = line_no + 2 line_no2 = line_no + 1 dr = q(q1,q2) cr = -1 * dr a_names_p = a_names)) a_names_b = a_names)) an1 = an2 = l_1 = key + '|' + str(line_n + '|' + man_ind + '|' + posting_da + '|' + u_name + '|' + an1 + '|' + source_id + '|' + 'GBP' + '|' + str(dr) l_2 = key + '|' + str(line_n + '|' +man_ind + '|' + posting_da + '|' + u_name + '|' + an2 + '|' + source_id + '|' + 'GBP' + '|' + str(cr) f.write(l_ + '\\n') f.write(l_ + '\\n') f.close() Create the Trial Balance and save it to a text file\u00b6 In\u00a0[8]: # Use gl to calc movement on each account gl = sep = '|') tb = # Calc net movement on each account tb = inplace=Tr tb.columns = = 'Account') # Assign account type # Set b/f balances to 0 for P&L accounts == 'P', 'Balance b/f'] = 0 == 'P', 'Type'] = 'P&L' == 'B', 'Type'] = 'BS' tb['Balanc b/f'] # if b/f balance is != 0, generate a balance for that account i = 0 for index, row in tb.iterrow if row['Balan dummy'] != 0: row['Balan b/f'] = bal = b/f'] = bal b/f'] = -1 * bal i += 2 del tb['Balanc dummy'] # create c/f field tb['Balanc c/f'] = ( tb['Balanc b/f'] + tb['Moveme ).round(2) # create 'date of balance' column tb['Balanc date'] = # Arrange columns tb = tb[['Accou 'Type', 'Balance b/f' , 'Balance c/f', 'Balance date']] # print TB to file sep='|', header=Tru index=Fals Load the text files back in and display their top 10 
rows\u00b6Verif that the files have been produced correctly and that the TB balances as\u00a0expecte In\u00a0[9]: gl = sep = '|') tb = sep = '|') In\u00a0[10]: tb.head(10 Out[10]: Account Type Balance b/f Balance c/f Balance date 0 ACB00003 BS 673.10 72348.68 20161231 1 ACB00007 BS -673.10 20045.09 20161231 2 ACB00010 BS 748.16 -30340.79 20161231 3 ACB00012 BS -748.16 188.39 20161231 4 ACB00015 BS 814.96 48294.11 20161231 5 ACB00017 BS -814.96 10659.80 20161231 6 ACB00021 BS 835.56 18357.33 20161231 7 ACB00032 BS -835.56 3406.80 20161231 8 ACB00034 BS 759.60 26505.40 20161231 9 ACB00036 BS -759.60 -39128.80 20161231 In\u00a0[11]: gl.head(10 Out[11]: Journal_ID Line Type Date User Account Source 0 J_20160101 1 M 20160101 Iain Gardiner ACP00054 sl2 GBP -587.49 1 J_20160101 2 M 20160101 Iain Gardiner ACB00064 sl2 GBP 587.49 2 J_20160101 3 M 20160101 Iain Gardiner ACP00022 sl2 GBP 816.17 3 J_20160101 4 M 20160101 Iain Gardiner ACB00017 sl2 GBP -816.17 4 J_20160101 5 M 20160101 Iain Gardiner ACP00088 sl2 GBP 628.60 5 J_20160101 6 M 20160101 Iain Gardiner ACB00062 sl2 GBP -628.60 6 J_20160101 7 M 20160101 Iain Gardiner ACP00079 sl2 GBP -672.34 7 J_20160101 8 M 20160101 Iain Gardiner ACB00017 sl2 GBP 672.34 8 J_20160101 1 A 20160101 NaN ACP00005 sl1 GBP -683.52 9 J_20160101 2 A 20160101 NaN ACB00036 sl1 GBP 683.52 In\u00a0[12]: print('Net Opening TB:',\"%.2f % tb['Balanc b/f'].sum( print('Net Closing TB:',\"%.2f % tb['Balanc c/f'].sum( Net Opening TB: 0.00 Net Closing TB: 0.00 if { var mathjaxscr = = = = ? 
"},{"title":"Reconciliation of a trial balance to a general ledger","category":"Technical/Data","url":"reconciliation.html","date":"3 January 2017","tags":"finance, accounting, journals, general ledger, trial balance, python ","body":"I've been working with financial ledgers a lot recently. The python code below shows an automated workflow to import, process and report on the reconciliation of a Trial Balance (TB) to a General Ledger (GL). I'm using fake data, but the script would work fine with real data if the fields were renamed appropriately. A real data set would have additional fields that would need to be considered, but these vary depending on the size and type of business being analysed. Therefore the fake GL and TB used here are simple and generic. 
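The heart of such a reconciliation is a net-movement comparison over an outer join. A condensed sketch with toy data (account names and amounts below are invented):

```python
import pandas as pd

# Toy stand-ins for the GL and TB extracts.
gl = pd.DataFrame({"Account": ["A1", "A1", "A2", "A3"],
                   "Amount": [100.0, -40.0, 60.0, -120.0]})
tb = pd.DataFrame({"Account": ["A1", "A2", "A4"],
                   "Balance b/f": [0.0, 10.0, 5.0],
                   "Balance c/f": [60.0, 80.0, 5.0]})

# Net movement per account on each side.
gl_move = gl.groupby("Account")["Amount"].sum().rename("GL_Movement")
tb = tb.set_index("Account")
tb["TB_Movement"] = (tb["Balance c/f"] - tb["Balance b/f"]).round(2)

# An outer join keeps accounts that appear on only one side,
# which is itself a reconciliation finding.
rec = tb.join(gl_move, how="outer")
rec["difference"] = (rec["TB_Movement"] - rec["GL_Movement"]).round(2)

# NaN differences (one-sided accounts) count as unreconciled: NaN != 0 is True.
unreconciled = rec[rec["difference"] != 0]
```

The outer join is the design choice that matters: an inner join would silently drop exactly the accounts (missing from one source) that a reconciliation exists to catch.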
Additional fields, such as entity code, transactio status, approver, time stamps, etc can be added quickly and\u00a0simply Set-up the notebook and import\u00a0dat In\u00a0[1]: # Import libraries import pandas as pd import random In\u00a0[2]: # Import (possibly incomplete and/or inaccurate data gl = sep = '|') tb = sep = '|') Create Calculate net movement for each account in the ledger\u00a0dat In\u00a0[3]: gl_move = gl_move = inplace=Tr = = 'Account') inplace = True) Calculate the movement for each account in the trial\u00a0bala In\u00a0[4]: = ( tb['Balanc c/f'] - tb['Balanc b/f'] ).round(2) inplace = True) Compare each accounts movement in the ledger data and the trial balance and write the result to a report containing the reconcilia results for all\u00a0accoun In\u00a0[5]: Rec_report = how = 'outer', left_index = True, right_inde = True) = - = date'] = sep='|', header=Tru index=True Put the accounts which do not reconcile into a In\u00a0[6]: Unreconcil = != 0] Unreconcil = = False).ind sep='|', header=Tru index=True Unreconcil Out[6]: Type Balance b/f Balance c/f Balance date TB_Movemen GL_Movemen difference Account ACP00081 P&L 0.00 -16279.96 20161231 -16279.96 -17017.89 737.93 ACB00082 BS 760.09 -4041.02 20161231 -4801.11 -4063.18 -737.93 ACP00071 P&L 0.00 -28547.84 20161231 -28547.84 -29260.31 712.47 ACB00091 BS 628.24 3054.74 20161231 2426.50 3138.97 -712.47 ACP00017 P&L 654.01 24449.74 20161231 23795.73 23123.06 672.67 ACB00076 BS 768.49 -48456.11 20161231 -49224.60 -48551.93 -672.67 ACB00037 NaN NaN NaN nan NaN 19808.54 NaN ACP00001 NaN NaN NaN nan NaN -3817.92 NaN ACP00041 NaN NaN NaN nan NaN -14365.77 NaN ACP00046 NaN NaN NaN nan NaN 1825.79 NaN ACP00086 NaN NaN NaN nan NaN -7263.37 NaN Create a report containing In\u00a0[7]: # Accounts not in the TB but in the GL = == True] In\u00a0[8]: # Accounts in the TB where TB_Movemen isn't matched in the GL = == ==False)] In\u00a0[9]: Total_Acco = = Total_Acco - = Rec_Fracti = / Total_Acco Unrec_Frac = / 
Total_Acco In\u00a0[10]: with \"w\") as text_file: print('The data 'accounts' 'accounts reconcile (', '%)\\n', 'accounts do not reconcile (', '%)', In\u00a0[11]: with \"a\") as text_file: print('The are', 'accounts in the GL and not in the TB. (', '% of unreconcil accounts)' print('The are', 'accounts with journals missing (', '% of unreconcil accounts)' In\u00a0[12]: net_diff = == with \"a\") as text_file: print('The net of all the difference is', net_diff, In\u00a0[13]: # Does the TB balance? with \"a\") as text_file: print('\\nT opening balance is unbalanced by', \"%.2f\" % tb['Balanc b/f'].sum( print('TB closing balance is unbalanced by',\"%.2f\" % tb['Balanc c/f'].sum( print('*** these are not 0 then the TB is certainly wrong and receiving a \\nbalanced TB is the first step to reconcilin all accounts** In\u00a0[14]: diffs = frequency = {} for w in diffs: frequency[ = 0) + 1 pairs = {x for x in frequency if x > 1} # dict comprehens to filter for pairs with \"a\") as text_file: print('\\nT are', len(pairs) 'unreconci accounts with equal and opposite difference In\u00a0[24]: with 'r') as fin: The data contains 110 accounts 99 accounts reconcile ( 90.0 %) 11 accounts do not reconcile ( 10.0 %) There are 5 accounts in the GL and not in the TB. ( 45.45 % of unreconcil accounts) There are 6 accounts with journals missing ( 54.55 % of unreconcil accounts) The net of all the difference is 0.0 TB opening balance is unbalanced by -2486.28 TB closing balance is unbalanced by 1326.45 ***If these are not 0 then the TB is certainly wrong and receiving a balanced TB is the first step to reconcilin all accounts** There are 3 unreconcil accounts with equal and opposite difference if { var mathjaxscr = = = = ? 
"},{"title":"Spotify song history","category":"Technical/Data","url":"spotify.html","date":"22 December 2016","tags":"music, spotify ","body":"Spotify recently sent me their '2016 Wrapped' email containing a few statistics about my listening habits in 2016, and a playlist of my 101 most played songs. I decided to compare their statistics with those I'd gathered myself. There is an option in the Preferences menu to log ('scrobble') your Spotify listening history to last.fm, from where you can then download your data. I thought it would be interesting to verify my Spotify stats, and see if I could discover any other patterns. 1. Notebook set-up: In [1]: # Display plot results inline, not in a separate window %matplotlib inline %pylab inline # Set the size of all figures pylab.rcParams['figure.figsize'] = (14, 5) # Import pandas and a nicer plotting style import pandas as pd import matplotlib.pyplot as plt plt.style.use('default') Populating the interactive namespace from numpy and matplotlib In [2]: # Make the dataframes look a bit prettier from IPython.display import HTML # Print output to 2 decimal places only; this sets behaviour for the entire notebook %precision %.2f Out[2]: '%.2f' 2. 
Load and explore the data\u00b6 In\u00a0[3]: # Load in the data from the .csv file I downloaded column_nam = ['Artist', 'Album', 'Name', 'd_t'] # Choose the column names df = names = column_nam header=Non # Convert the date time column (d_t) from a string to date time, so that I can search by date df['d_t'] = Some initial questions to answer to get a feel for the\u00a0data: How much data is\u00a0there? What type of data is\u00a0present What are the maximum and \u2026etc. In\u00a0[4]: # View the top 10 rows df.head(10 Out[4]: Artist Album Name d_t 0 Bonobo Animal Magic Kota NaT 1 Samaris Samaris G\u00f3\u00f0a tungl 2016-12-15 09:44:00 2 The Staves Dead & Born & Grown Mexico 2016-12-15 09:40:00 3 Bonobo Animal Magic Kota 2016-12-15 09:34:00 4 The Gloaming The Gloaming The Sailor\u2019s Bonnet 2016-12-15 09:30:00 5 St. Germain St Germain Forget Me Not 2016-12-15 09:24:00 6 Yosi Horikawa Wandering Bubbles 2016-12-15 09:19:00 7 South Music from The O.C. Mix 1 Paint the Silence 2016-12-15 09:13:00 8 The Staves Sleeping In A Car Sleeping In A Car 2016-12-15 09:09:00 9 Bonobo Days to Come Recurring 2016-12-15 09:04:00 In\u00a0[5]: #How many rows are there? len(df) Out[5]: 36454 In\u00a0[6]: # How many 2016 rows == 2016]) Out[6]: 1983 In\u00a0[7]: # Are there any NaN values in 2016? Ans = == 2016) & print('The are', Ans, 'rows with NaN values in 2016') There are 0 rows with NaN values in 2016 In\u00a0[8]: # Maximum and minimum dates print(\"Old record print(\"Mos recent record Oldest record is: 1970-01-01 00:00:00 Most recent record is: 2016-12-15 09:44:00 Scrobbling to last.fm was turned off for the first half of\u00a02016. In\u00a0[9]: #print(\"Ol record in 2016 == in 2016 began == Scrobbling in 2016 began at: 2016-06-14 20:24:00 Therefore it will be important to remember that the data being analysed only represents about half of\u00a02016. 3. 
Verify Spotify's statistics. 3a. Total minutes: According to Spotify, I listened to 2357 unique artists this year, and 3309 unique tracks. Total streaming time was 45202 minutes. They don't tell me how many non-unique plays I accumulated. Googling 'average song length' reveals that most pop songs last 3.5 minutes, but I think the songs I listen to are longer than that. Spotify's UI is understandably light on details, so initially I wondered how to calculate my average song length. First I just eyeballed the data ('scanning' analytics). Then I realised I could sort the playlist by song length and then I could pick the median, although I'd need to know how many songs were in the playlist. Whilst searching for playlist properties like 'number of songs' I simply spotted what I needed below the description: 101 songs, total play time of 8hrs, 29 minutes. Spotify tell me I've spent 45202 minutes streaming music and listened to 3309 unique tracks in 2016. Therefore: In [10]: # Average length of my 101 most listened to songs Average = round((8*60 + 29)*60/101, 2) print("Average song length:", Average, "seconds") print("or 5 minutes, 2 seconds \n") # Time spent listening to Spotify in days print("Days listening to Spotify in 2016:", days_listening, "...Really? A whole month?!? \n") # Time spent listening to Spotify in seconds seconds_used = 45202 * 60 # Number of songs played n_tracks = round(seconds_used / Average, 2) print("Number of songs played in 2016:", n_tracks) # Average number of plays per song unique_tracks = 3309 # From summary email average_plays = round(n_tracks / unique_tracks, 2) print("On average, I listen to each track", average_plays, "times") Average song length: 302.38 seconds or 5 minutes, 2 seconds Days listening to Spotify in 2016: 31.46 ...Really? A whole month?!? Number of songs played in 2016: 8969.24 On average, I listen to each track 2.71 times This is a surprisingly large amount of time spent listening to music. I cannot believe I spent 1/12 of 2016 streaming music from Spotify. 
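The arithmetic above can be reproduced directly; the figures below are the ones quoted from the 'Wrapped' summary:

```python
# Figures quoted from Spotify's "2016 Wrapped" summary in the post.
playlist_seconds = (8 * 60 + 29) * 60      # top-101 playlist: 8 hrs 29 mins
avg_song_length = round(playlist_seconds / 101, 2)   # -> 302.38 seconds

total_seconds = 45202 * 60                 # total 2016 streaming time
n_plays = round(total_seconds / avg_song_length, 2)  # -> 8969.24 plays
plays_per_track = round(n_plays / 3309, 2)           # 3309 unique tracks -> 2.71
```

This is only an estimate, of course: it assumes the top-101 playlist's average length is representative of everything streamed.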
I also thought I was quite a repetitive listener, so the average of 2.7 plays per song is surprisingly low. (For example, I have a short playlist with 'Recurring' by Bonobo in it 5 times.) 3b. Unique artists: According to Spotify, I listened to 2357 unique artists in 2016. In [11]: Ans = df[df['d_t'].dt.year == 2016]['Artist'].nunique() print('Number of unique artists scrobbled in 2016:', Ans) Number of unique artists scrobbled in 2016: 510 If all songs were being scrobbled to last.fm and my listening behaviour remained the same throughout the year, I would have expected the result to be about half of Spotify's result (because I only began scrobbling in June). The summary email reported 2357 unique artists in 2016. 510 is clearly much less than half of 2357. 3c. Unique tracks: In [12]: Ans = df[df['d_t'].dt.year == 2016]['Name'].nunique() print('Number of unique tracks scrobbled in 2016:', Ans) Number of unique tracks scrobbled in 2016: 1060 Spotify reported 3309 unique tracks played in 2016. My result of 1060 unique tracks in only the second half of 2016 is less than I would have expected, though the difference between expectation and result is not as great as for the unique artists. 3d. My top tracks: According to Spotify, my 3 most played tracks in 2016 are: Lights Out Words Gone, Shuffle, Luna ...all by Bombay Bicycle Club. In [13]: # Which tracks did I listen to the most in 2016? df[df['d_t'].dt.year == 2016]['Name'].value_counts().head(10) Out[13]: Miserere 14 Shuffle 14 Lights Out Words Gone 12 The Pilgrim's Song 12 Always Like This 12 Luna 11 The Hare 11 You Already Know 11 Koop Island Blues 10 The Sailor's Bonnet 10 Name: Name, dtype: int64 This shows that whilst the 'top tracks' are some of my most listened to tracks, Spotify either uses a different method to calculate plays, or 'top' is not synonymous with 'most played'. When using data from June onwards, my 'top tracks' are only my joint 1st, 3rd and 6th most played. 3e. 
Top artists: According to Spotify, my top artists are Bonobo, Bombay Bicycle Club, The Staves and Jack Johnson. In [14]: df[df['d_t'].dt.year == 2016]['Artist'].value_counts().head(10) Out[14]: The Staves 170 Bonobo 161 The Gloaming 132 Koop 107 Bombay Bicycle Club 105 Giovanni Pierluigi da Palestrina 70 Jimi Hendrix 48 Jack Johnson Gregorio Allegri 43 Broken Social Scene 33 Name: Artist, dtype: int64 My results are different to Spotify's, but not by much. Artist / Spotify's result / My result: Bonobo 1 2; Bombay Bicycle Club 2 5; The Staves 3 1; Thievery Corporation 4 -; Jack Johnson 5 8. Apart from Thievery Corporation being conspicuously absent from my own scrobbled data, all other differences could reasonably be explained by listening trends being different from Jan - June than July - Dec. The results show that I prefer listening to Bonobo over Bombay Bicycle Club, but that my 3 most played songs are all by Bombay Bicycle Club. This shows that there are more songs by Bonobo that I enjoy listening to, and that I have more polarised opinions about songs by Bombay Bicycle Club. 3f. Favourite day to listen: According to Spotify, I stream more music on Saturday than any other day. In [15]: # Plot how many songs were played on each day of the week in the second half of 2016 df[df['d_t'].dt.year == 2016]['d_t'].dt.dayofweek.value_counts().sort_index().plot(kind='bar') Out[15]: (bar chart; Monday=0 and Sunday=6) This result shows that I stream more music on a Sunday than any other day. This is a different result to Spotify's. The small difference could be explained (again) by my data set only relating to the second half of 2016. From January to June I listened to more music on a Saturday than a Sunday - probably when I was playing Starcraft. This hobby unfortunately was put on ice and had to come second to revision for exams in July and November. 4. Time of day: I thought it would be interesting to also see how much music I listen to at different times. I expect my listening habits to be different during the week compared to weekends, so I split the data by day and then summed the songs played whilst grouping them by hour. 
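Splitting plays by day type and grouping by hour can be sketched like this (the timestamps are invented stand-ins for the last.fm export; `dayofweek` uses Monday=0, as in the post):

```python
import pandas as pd

# Invented scrobble timestamps standing in for the last.fm export.
df = pd.DataFrame({"d_t": pd.to_datetime([
    "2016-07-02 09:05",   # Saturday morning
    "2016-07-02 09:40",   # Saturday morning
    "2016-07-02 21:10",   # Saturday evening
    "2016-07-04 09:15",   # Monday
])})

# dayofweek: Monday=0 ... Saturday=5, Sunday=6
weekend = df[df["d_t"].dt.dayofweek > 4]

# Count plays in each hour of the day across the weekend rows.
by_hour = weekend.groupby(weekend["d_t"].dt.hour).size()
```

Plotting `by_hour` (e.g. `by_hour.plot(kind='bar')`) gives the weekend hour-of-day chart; flipping the filter to `< 5` gives the weekday version.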
The figures below show an unexpected dip around 3pm at weekends, and consistent streaming through the day whilst at the office during the week. No matter which day of the week, it appears I always listen to music in the evening. In [16]: # Count number of songs played during each hour at the weekends. In [17]: # Count number of songs played during each hour during the week. 5. Conclusions The attempts to verify Spotify’s statistics fall short because my data only covers the second half of 2016. This shows the requirement of good quality inputs in order to achieve good results. Having said that, it is clear that I listen to Spotify far more than I thought I did. The observation that I like many of Bonobo’s tracks but have more polarised views about Bombay Bicycle Club’s songs is novel. Most of all, I think I’m getting great value. The chart below is the number of songs played each day from 14 June 2016 to 15 December 2016. My top 101 songs of 2016 can be played at:"},{"title":"Vim!","category":"Technical/Developer Tools","url":"vim.html","date":"1 December 2016","tags":"vim ","body":"Vim is a text editor renowned for its efficiency and its use of keyboard shortcuts. It’s based on the Vi text editor from the 1970s.
It was first released in 1991 and is still being developed today. It comes pre-installed on Unix systems (including MacOS) and can be run from the terminal. Vim is famous in another way too - for being difficult to learn. I found some good and remarkably creative tools to begin learning its concepts and controls. This was necessary because there is no GUI. There is a game here, and there is this interactive tutorial. There’s also a built-in vim tutorial - just type ‘vimtutor’ into Terminal. Vim is designed so that you don’t need to take your hands off your keyboard and use a mouse. It has the ‘insert’ mode where you enter text as usual, and the ‘command’ mode where you can make use of a comprehensive and flexible shortcut language to move around, edit and search the text. With no GUI or toolbar, it’s a very different approach to text editing than I’m used to. You can run Vim from the terminal, but there are also versions that run as apps. MacVim on MacOS has the option to show a toolbar of simple commands like a normal program, and lets the arrow keys move the cursor in addition to Vim’s ‘hjkl’ functionality. This makes getting started easier. There are also a lot of plugins to extend Vim’s functionality and turn it from a text editor into an IDE. This post walks you through setting up Vim as a Python IDE and explains how to manage plugins. I recommend Daniel Miessler’s blog post for a quick overview of how to use Vim."},{"title":"Autumn, BIN and $PATH","category":"Technical/Developer Tools","url":"autumn-bin-and-path.html","date":"1 December 2016","tags":"unix ","body":"Two small things have been learnt recently: the importance of $PATH and the contents of various bin folders. Autumn 2016 has not gone as planned. Whilst I studied for a couple of exams, plans were put on hold and hobbies were ceased.
Now that life is returning to normal, I have the opportunity to post again. PATH $PATH is a variable (string) which contains a series of folder locations separated by “:”. Each of these folders contains programmes. When you type the name of a programme into terminal without specifying its location, the OS looks sequentially in each of the folder locations listed in $PATH to see if the programme is there, and then executes it. BIN Bin as in Binary, not Bin as in Trash. The bin folders contain binary files, which are programmes ready to be run. If I run “echo $PATH” from the Terminal, I see 9 folders called bin, and it’s only by convention that they contain binaries. They are just normal folders, which the OS is set to look in when asked to run a programme."},{"title":"Introduction to my doctorate research - Silos","category":"Technical/Other","url":"silos.html","date":"2 October 2016","tags":"doctorate, granular flow, granular materials, phd, research, engineering, silo, thesis, vienna ","body":"Background From Spring 2010 until the Autumn of 2013, I was a PhD candidate living in Vienna, Austria and working at the University of Natural Resources and Life Sciences. Before working in Vienna, I completed my Masters degree in Civil and Environmental Engineering at the University of Edinburgh. My research quantified the effects of changing the amount of gravity acting on granular materials as they poured out of a silo. My thesis and the short presentation I used to defend it are available. Granular materials are a broad class of materials that are encountered every day - salt, pills, breakfast cereal, sand, rice, soil, landslides are all granular materials. They are ubiquitous and occur in many different sizes and varieties. Silos are a common type of container for storing granular materials.
You pour the granular material in from the top, store it for a while, and then dispense the material in controlled quantities from the bottom. Research focus My research focussed on quantifying how changes in gravity affected the material contained inside a silo, particularly whilst the silo was being emptied. This is pertinent because engineers and scientists do not yet have a scientific understanding of how granular materials behave. Whilst gravity clearly affects a granular material, we cannot say exactly how. This means we can’t use analytical methods to quantify the physics that occur in a real system. Instead we use empirical methods, guesses, and knowledge of what worked in previous similar situations. This isn’t necessarily bad, but it is less efficient and less reliable than an analytical approach. I built a small model silo (30cm tall) and put it into quite a large centrifuge (3 metre diameter). By rotating the model silo around the centre of the centrifuge at a constant speed I could simulate a higher gravity. I added high-speed cameras, pressure sensors and weighing scales so that I could measure how the material was moving once I opened the silo outlet and the silo began to empty. Photos of the experimental model I built can be seen below. I also programmed a computer model (using the commercial PFC 3D software and working in the FISH scripting language) to simulate and investigate if the same behaviours could be observed numerically as physically. The class of computer model I used is known as DEM (Discrete Element Modelling). These models work by considering every grain of material individually, usually as a sphere. If one sphere overlaps with another (i.e. the distance between the two particle centres is less than the sum of their radii) then a force proportional to the overlap size repels the two spheres away from each other.
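That contact rule can be sketched in a few lines. This is only an illustrative linear-spring model in Python, not the PFC 3D / FISH implementation used in the research, and the stiffness k is an arbitrary number rather than a calibrated material parameter:

```python
import numpy as np

def contact_force(c1, c2, r1, r2, k=1.0e4):
    """Return the repulsive force on sphere 1 due to overlap with sphere 2.

    k is an illustrative spring stiffness, not a calibrated DEM parameter.
    """
    d = np.asarray(c2, dtype=float) - np.asarray(c1, dtype=float)
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if dist == 0.0 or overlap <= 0.0:
        return np.zeros_like(d)       # spheres not touching: no contact force
    normal = d / dist                 # unit vector from sphere 1 towards sphere 2
    return -k * overlap * normal      # force proportional to overlap, pushing 1 away

# Two unit spheres whose centres are 1.5 apart overlap by 0.5
f = contact_force((0.0, 0.0, 0.0), (1.5, 0.0, 0.0), 1.0, 1.0)
```

Summing this pairwise force over every contact, then integrating the resulting accelerations, is the loop a DEM code repeats at every timestep.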
This simple approach is repeated over every grain or particle in the model, and produces life-like behaviour in many situations. It has many advantages over “continuum” based techniques that model groups of grains as if they were all just one big particle with unusual properties. DEM has one massive limitation though. It requires huge amounts of computational resources - and this limits its use in industrial scenarios. Until computers become much, much more powerful, DEM will only be used for theoretical research. Results When gravity increases by a factor of \\(x\\), both the discharge rate and local velocities within the silo increase by \\(\\sqrt{x}\\). That’s good to know if you’re planning on storing stuff on the moon, but it’s also a useful step towards explaining exactly why bulk granular materials behave the way they do. An overview of my research, my doctorate thesis and published papers can be downloaded below. PDFs containing 3D models and movies require flash to render, and Adobe Reader Desktop must be used in order to view them. PhD Thesis. Modeling silo discharge in a centrifuge. Experimental investigation of flow and segregation behaviour of bulk solids in silos under high gravity conditions (Particles 2013). Centrifugal modelling of granular flows (Eurofuge). Overview of research (PhD defence)."},{"title":"Encryption","category":"Technical/Cryptocurrencies","url":"encryption.html","date":"30 August 2016","tags":"encryption, rsa, dlt, blockchains, bitcoin, digital currencies, public key cryptography ","body":"Blockchains use Elliptic Curve Cryptography (ECC) to authenticate users and authorise transactions. These notes introduce the field of cryptography and explain how modern cryptographic methods work.
I wrote them to teach myself about encryption.1 To begin with the absolute basics, encryption generally works by taking a message and representing it as a series of numbers,2 which are then turned into a series of random-looking numbers. Decryption works by turning the random-looking numbers back into the original message. Background The history of cryptography can be split into classical and modern eras. Modern cryptography began in the late 1970s with the introduction of the Diffie-Hellman (1976) and RSA (1977) algorithms. Until then, cryptography required using a single key (the secret code) to encrypt and decrypt the message. This was transferred from sender to receiver secretly, but not always securely. In classical cryptography the code is a shared secret. The modern era removed the requirement for a shared secret and instead used number theory as a basis for quantifying the strength of an encryption method. The strength of a modern cryptographic technique is quantifiable and provable by reference to number theory, rather than a user’s ability to transport or transfer a secret code. Modern cryptography is defined by Public Key Cryptography systems. They use one key (code) for encryption and another for decryption. The encryption key can be made public without any risk of the message being decrypted, and is therefore known as the public key. The key used to decrypt data is the private key, and must not be revealed. If a message is encrypted with the public key it can only be decrypted with the private key. Public Key Cryptography (PKC) systems use algorithms that are easy to process in one direction but difficult to reverse, which are known as mathematical trap-door functions. A simple example of a trap-door function is the product of two prime numbers. If two random prime numbers are chosen, it is trivial to find their product.
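The asymmetry can be seen in miniature in Python: multiplying the primes is one operation, while recovering them from the product requires a search. The primes here are deliberately small so that the "hard" direction still finishes instantly; with hundreds of digits it would not:

```python
# Easy direction: one multiplication of two (publicly chosen, demo-sized) primes.
p, q = 10_007, 10_009
n = p * q

def factor(n):
    """Hard direction: naive trial division, whose work grows with sqrt(n)."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return None        # no odd factor found (n prime, or even - ignored here)

found = factor(n)
```

Real factoring algorithms are far better than trial division, which is part of the point the next paragraphs make about key sizes.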
However if only the product of the two numbers is known, it is relatively difficult to find either of the factors used to create the number (this is the trap door). This was first noticed in 1874, when W. S. Jevons asked: “Can the reader say what two numbers multiplied together will produce the number 8,616,460,…? I think it unlikely that anyone but myself will ever know.” This simple problem shows that finding the product of two (secret) prime numbers is simple, but factorising the result is not. This type of problem is a key feature of modern cryptography. Factoring is a famous mathematical problem: Eratosthenes3 studied the primes in the 3rd century BC, and more recently the RSA Factoring Challenge has tracked factorisation techniques by issuing cash prizes for the factorisation of products of large primes. Generally, the bigger the difference in difficulty between executing the function and reversing it, the better the function. The RSA algorithm below uses factorisation as the foundation of its security, but factorisation is not the hardest problem to solve relative to the size of the keys required. Algorithms have been developed to factor the products of large prime numbers, and are much more efficient than randomly guessing possible factors. The greater the size of the primes being factored, the more efficient these algorithms become, and therefore the difference in difficulty between executing the function (multiplying two large primes) and reversing it becomes smaller as the cryptographic key length increases. This is a problem because as public key cryptography becomes more commonly used the resources available to factor products of primes increase, and consequently larger keys are required. Ultimately encryption techniques based on the difficulty of factorisation will become redundant as the difficulty gap between creating and solving them shrinks.
A better trap door function is required. Overview of the RSA algorithm Named after its inventors (Ron Rivest, Adi Shamir, and Leonard Adleman), RSA was one of the first public-key encryption algorithms and is still widely used. RSA (as well as other cryptographic techniques) makes use of a number line which loops back to zero after reaching a maximum value, rather than increasing indefinitely. This means that once a maximum number \\(n\\) has been defined, if a number greater than \\(n\\) is created the result simply loops around to 0 and begins counting from 0 again, i.e. if \\(n = 10\\), then \\(7 + 5 = 12 - 10 = 2\\). The result of a calculation on a looping number line may easily be found by doing long division and using the remainder as the final answer, i.e. \\(12 / 10 = 1\\) with remainder \\(2\\). Generation of a pair of RSA keys: 1. Generate the RSA modulus. Select two large random prime numbers, \\(p\\) and \\(q\\). They need to be random because anyone who knows or guesses them will be able to decrypt messages. Calculate \\(n = pq\\). 2. Find derived number (e). e must be greater than 1 and less than \\(( p - 1)( q - 1)\\). There must be no common factor for e and \\(( p - 1)( q - 1)\\) except for 1.4 3. Form the public key. The pair of numbers \\((n, e)\\) form the public key and can be made public. Even though \\(n\\) is public, it is so difficult to factor the product of 2 large prime numbers that an attacker would not be able to find its component primes in the time available. The strength of RSA rests entirely on the difficulty of factoring \\(n\\) into its two component prime numbers. 4. Generate the private key (d). The private key is generated by using \\(p\\), \\(q\\) and e as inputs to the Extended Euclidean Algorithm. For a given set of values, there is a unique answer \\(d\\). \\(d\\) is the inverse of \\(e\\) modulo \\(( p - 1)( q - 1 )\\).
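The four key-generation steps condense into a few lines of Python. This sketch uses the toy numbers from the worked example that follows (p = 7, q = 13, e = 5); Python's built-in pow(e, -1, phi) computes the same modular inverse the Extended Euclidean Algorithm would:

```python
from math import gcd

# Step 1: choose primes and form the RSA modulus
p, q = 7, 13
n = p * q                      # 91
phi = (p - 1) * (q - 1)        # 72

# Step 2: derived number e - no common factor with (p-1)(q-1) except 1
e = 5
assert gcd(e, phi) == 1

# Step 4: private exponent d, the inverse of e modulo (p-1)(q-1)
d = pow(e, -1, phi)            # modular inverse (Python 3.8+)
assert (e * d) % phi == 1

# Step 3: publish (n, e); keep (n, d) secret
public_key, private_key = (n, e), (n, d)
```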
This means that \\(d\\) is the number less than \\(( p - 1 ) ( q - 1 )\\) such that when it is multiplied by e, it is equal to \\(1\\) modulo \\(( p - 1 ) ( q - 1 )\\). RSA example RSA does not directly operate on strings as bits; it operates on numbers modulo (less than) \\(n\\), and it is necessary to represent plain text as a series of numbers less than \\(n\\). The dominant encoding on the internet is UTF-8, which represents each upper case Latin letter as a number between 65 and 90. Using this encoding, the message “HELLO” would become “\\(72, 69, 76, 76, 79\\)”. The maximum number \\(n\\) needs to be the product of the two prime numbers \\(p\\) and \\(q\\). For this example choose \\(p = 7\\) and \\(q = 13\\), so \\(n = 91\\).5 The public key component e can be any number we choose, as long as there is no number other than 1 which is a common factor of e and \\(( p - 1 ) ( q - 1 )\\). In our example, this requires that there be no common factor between 72 and e other than 1, so let e \\(= 5\\). Therefore our public key is (91, 5). This can be made available to anyone without messages being decrypted because of the difficulty of factoring the product of (very large) prime numbers. Using the fact that we know 7 and 13 are the prime factors of 91 and e is 5, we can use the Extended Euclidean Algorithm to compute our private key \\(d\\), which is 29. Therefore when the prime factors 7 and 13 are used, the public key is (91, 5) and the private key is (91, 29). These parameters fully define a functional RSA system. Encoding To encode the letter H in a message (‘H’ is \\(72\\) in UTF-8), we raise it to the power \\(e\\) (\\(e = 5\\)), i.e. multiply it by itself repeatedly, remembering to wrap around each time we pass our maximum value of \\(n = 91\\). \\(72 \\times 72 = 5184\\); \\(5184 / 91 = 56\\) with \\(88\\) remaining (i.e. \\(5184 = 91 \\times 56 + 88\\)).
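The rest of the hand calculation continues below. As a check, this multiply-and-wrap procedure is exactly modular exponentiation, so Python's three-argument pow reproduces the whole encryption and decryption with the toy key pair (n=91, e=5, d=29):

```python
# Toy RSA from the worked example: public key (91, 5), private key (91, 29).
n, e, d = 91, 5, 29

message = [ord(c) for c in "HELLO"]          # [72, 69, 76, 76, 79]
cipher = [pow(m, e, n) for m in message]     # encrypt with the public key
recovered = [pow(c, d, n) for c in cipher]   # decrypt with the private key

assert recovered == message                  # round-trips back to "HELLO"
```

For the first letter, pow(72, 5, 91) gives 11, matching the step-by-step wrap-around arithmetic.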
Therefore, working modulo 91: \\(72 \\times 72 = 5184 = 88\\); \\(88 \\times 72 = 6336 = 57\\); \\(57 \\times 72 = 4104 = 9\\); \\(9 \\times 72 = 648 = 11\\). Therefore the encrypted value of “H” is “\\(11\\)”. Using this method for each character in the message “HELLO” gives the encoded message. To decrypt the message, we take each number and raise it to the power \\(d\\) (\\(d = 29\\)), wrapping around each time we pass \\(91\\). \\(11 \\times 11 = 121 = 30\\); \\(30 \\times 11 = 330 = 57\\); … \\(57 \\times 11 = 627 = 81\\); \\(81 \\times 11 = 891 = 72\\). And we’re back to our original message. Files The spreadsheet I used to calculate the encrypted and decrypted values can be downloaded here. A simple python script to encrypt and decrypt a message is here. Footnotes 1. I used the explanations here and here a lot. ↩ 2. A simple example is \\(A=1, B=2\\) etc. ↩ 3. Eratosthenes invented his famous sieving algorithm, which finds all the primes up to a given limit. ↩ 4. If this is the case then e and \\(( p - 1 ) ( q - 1 )\\) are called “coprime”. ↩ 5. Whilst the Extended Euclidean Algorithm is apparently simple to compute, its description is not. Therefore I’ve used the same numbers in the following example as in the tutorials here and here. ↩"},{"title":"Digital currencies: the basics","category":"Technical/Cryptocurrencies","url":"digital-currencies-the-basics.html","date":"19 August 2016","tags":"bitcoin, cryptocurrencies, cryptography, digital currencies, finance, fintech ","body":"Digital currencies are often discussed in the context of finance, technology and economics. The Blockchain - the technology which applications like Bitcoin are built on - is significant because it removes the need for trust or an intermediary between unrelated parties transacting with each other.
So far, the most influential and famous digital currency is Bitcoin. This post is intended to introduce the basic concepts of digital currencies and the problems a distributed ledger system needs to overcome. What is a digital currency? A digital currency is an internet-based medium of exchange. Units of digital currency are not printed, are not physical, and represent nothing. A unit of currency is produced by running algorithms to solve complex mathematical problems. When a solution is found, a unit of currency (for example, 1 Bitcoin) is generated. If the currency represents nothing, why is it valuable? Because people believe that in future, other people will believe it does, and because people are willing to trade real goods and services in exchange for it. This is the same as for dollars, sterling and euros (fiat currencies) which also don’t represent anything physical. (Although these examples are supported by laws and governments.) In the past, creating a new currency without the support of government hasn’t been possible: A central bank was required to control the physical creation of new currency (otherwise people would create counterfeit currency, decreasing scarcity and moving its value towards zero). An intermediary (a bank) was required for all large or remote transactions to make sure that the amount of money each party owns is correctly recorded and updated in a ledger (preventing double spending of funds). The technological breakthrough was preventing double spending without requiring an intermediary. This is made possible by using cryptographic techniques developed over the last few decades, and cheap, powerful computers which have only recently become available. Central and distributed ledgers With conventional currencies, everyone’s balance and transactions are recorded in one central ledger (a list showing how much money each account has) and each account holder only has access to their own balance and transactions. With digital currencies, a copy of the entire ledger (every transaction ever made by everyone) is held on
each computer (or node), and anyone can see it. With a central ledger, if two parties wish to make a remote transaction then they need a bank to be the intermediary. The bank mediates by updating the central ledger to record the change in each party’s funds as a result of the transaction. This is how one party knows if the counter-party is able to pay, and how payment is confirmed. If there is only one copy of the ledger, maintained by the bank, then the bank must be involved in every transaction between its account holders. This need for an intermediary increases the complexity and cost of transactions. Sending money To send money, a message is broadcast to the network that the amount in your account should decrease and the amount in another account should increase. Each computer in the network (a node) which receives this message will check its authenticity, make the changes, and pass the message along to other nodes. What problems does a distributed ledger need to overcome? For a transaction to be accepted and entered into the distributed ledger, its authenticity needs to be verified. Because the ledger is distributed, everyone can see everyone else’s transactions. Therefore user authentication and transaction authorisation need to be possible without compromising a user’s ability to send secure payments in future. There is also the problem of double spending - because the currency is neither physical, printed nor representative of anything, how do you prevent a user from spending their currency more than once, or simply creating as much new currency as they want? Another problem is the addition of new transactions to the ledger from many unrelated users. If each party has their own copy of the ledger, updating (or changing) it as they want, how would the ledger’s completeness and accuracy be assured? How would you update your ledger to take account of transactions between third parties, and how would you know the order in which they occurred?1 The blockchain is remarkable because it is the first technology to solve all of these problems.
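The broadcast-and-check idea can be sketched as a toy in Python. This is a deliberately simplified illustration, nothing like Bitcoin's actual protocol (no signatures, no blocks, no consensus): a node validates each broadcast transaction against its own copy of the ledger before applying it, which is the essence of the double-spend check:

```python
# Hypothetical balances held in one node's copy of the ledger
ledger = {"alice": 50, "bob": 10}

def apply_transaction(ledger, sender, receiver, amount):
    """Validate a broadcast transaction against this node's ledger copy."""
    if amount <= 0 or ledger.get(sender, 0) < amount:
        return False                 # reject: sender cannot cover it (overspend)
    ledger[sender] -= amount
    ledger[receiver] = ledger.get(receiver, 0) + amount
    return True

first = apply_transaction(ledger, "alice", "bob", 30)    # accepted
second = apply_transaction(ledger, "alice", "bob", 30)   # rejected: only 20 left
```

The hard parts the blockchain actually solves - proving the message really came from the account holder, and getting every node to agree on the order of transactions - are exactly the problems listed above.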
Future posts will consider how each of these problems is overcome. 1. This is the Byzantine Generals problem, which is nicely described in the introduction of this paper. ↩"},{"title":"Spare time","category":"Non-technical/Learning","url":"spare-time.html","date":"13 August 2016","tags":"habits, advice, reflection ","body":"This is a list of interests I want to consider pursuing. I wrote it when I began to study for my last set of exams and my mind filled with things I’d rather be doing instead. Some of these interests are just me reacting to having no spare time for a few months, but others are decent goals and projects. Temporarily losing my spare time made me value it more. I wrote this post so that I could compare what I thought was important when I was busy to when I wasn’t. Writing the list allowed me to move on without forgetting anything. Here’s a break-down of each item: “Run 10k in 50 minutes” - This is easier than I imagined. When I wrote the list I could barely run 2k without stopping, but intended to go for a weekly run whilst studying. I’ve struggled to stay energetic and healthy during previous exam phases, so now I have a rule that I must do some mild exercise even if I think I can’t spare the time. I find running really lowers my stress levels, and increasing my heart rate and working up a sweat lets me sleep better and concentrate for longer. After a few weeks of minimal running, I could run 5k easily. If I kept at it, 10k in 50 mins would be easy. But see point 2… “Be Insanity strong” - As in, do the Insanity workout programme. Again. I thought about it and decided to do P90x instead. Week 1 is going great. “Read the bible habitually” - When I was 17 I found out about Jesus and became a Christian. In the months and years after that I’d regularly read my bible almost every day and pray a lot. I wanted to understand, so I set a goal of 5 chapters each day.
I read through the new testament repeatedly and read most of the old testament during this time. I knew the scriptures well enough that a lot of sermons became boring and obvious - I’d already studied the bible passage being used. During this time I remember being aware that the way I thought was different to how it would have been otherwise. My perspectives were long-term and less me-focused. I thought about what I was reading instead of the days’ headlines or social chatter. I remember enjoying the benefits and thinking that I should keep this habit. The reasons why I think Christianity is so wonderful are well summarised in this sermon. It’s now 10 years later, and whilst my convictions are strong, my knowledge of the bible is sadly pretty fuzzy. My thinking is clouded by the perspectives contained in the media I consume and the conversations I’m part of. I strongly suspect I would be acting and thinking differently if I read my bible more, but I don’t know what those differences would be. Trouble is, it’s often not obvious what the immediate benefits of reading the bible are; you have to work for it a bit. Praying for help is effective. The books in the bible were written to last through the ages and across all cultures, so it’s not surprising that they’re not as easy to read as something written for an English speaker in a hurry. I should stop being in a hurry. “Think more” - On a similar note to 3, but less supernatural. If I spent less time consuming content and a little more looking around me or walking, I reckon I’d be more self-aware and make better decisions. This would probably lead to a happier life. “Pray more” - I don’t know why, but the creator of the universe wants me to talk and share my life with him. He cares about me. This makes no sense to me, at all. If I was God, I would not go out of my way to consider the views and concerns of a very flawed human.
But when I pray, my prayers are very often answered. I should ask him about this. “Amazon seller?” - Amazon have this “Fulfilled by Amazon” service, which means you don’t even need to hold the stock you want to sell. If I choose the right products, import them cheaply from China and resell them on Amazon at a profit, I end up quids in for minimal effort and manageable risk. “Blog about interesting data” - Here I am, blogging. I should stick to the main topic and get technical. I was inspired by this blog in particular. “Finish Coursera” - The data science specialization is great! It’s in R, and I want to focus on Python, but I’d still like to do it. I need to consider the opportunity cost of the time required. “Read for fun, history, fiction” - When I started my job, in April 2014, I was half way through Savage Continent, which is an eye-opening and eye-watering history of Europe in the years after World War 2. I see Europe through different eyes because of it. However I only got half way through, and since April ’14 I never felt I had the free time or energy to pick it back up. This should change. When I was a researcher and when I was a student, I had so much more opportunity to develop my own pursuits. Since entering the corporate world, I find myself fighting a war of attrition to exert my personality onto my lifestyle. Read for fiction… I’m unconvinced. What do you have at the end of it? What can you do with it? Maybe I’ve just been reading the wrong authors, but I’m going to leave this for now. Sure, you could gain an appreciation of a different time or place, but that appreciation comes via the fictional characters and events; it’s secondary. What about abstract constructs, perspectives, morals… things that history books are ill-suited for? Great fiction would be essential for exploring these. But for now, I’ll prioritise point 3.
\u201cDo a photograph project\u201d - This I would love to do, but probably won\u2019t. It would be a luxury, and the opportunit cost would be too great right now. I\u2019d like to shoot a series of black and white portraits, and turn them into large prints. I think good portrait photograph is uniquely impactful and moving, choosing B&W removes distractio and leaves a subjects humanity more\u00a0expos \u201cHave a list of ideas\u201d - There\u2019s no excuse for this one, anyone can have several good ideas. It\u2019s turning them into reality that takes skill. Need to have the ideas first,\u00a0tho \u201cDo a law MOOC\u201d - i.e Study particular areas of law, in my own time and at my own pace. I studied a tiny bit of law during the ACA, and realised employment law or contract law could be really useful. (Same for the tax system - another surprise). We only had a brief introducti though, so if I could find the time I\u2019d like to know\u00a0more. \u201cDo an InfoSec or Network Security MOOC\u201d - Its super interestin but not likely to be a good use of\u00a0time. \u201cLearn to fight - Krav Maga / MMA\u201d - Ever since watching The Bourne Identity I\u2019ve wanted to learn Krav Maga, and Georges St-Pierre made me want to train for MMA. For now though, I\u2019ll do P90x. I can reconsider in 90\u00a0days. \u201cGet out of London\u201d - My contract ends in April 2017, next summer will be a crossroads I hate the commute, I hate being constantly rushed. 
Living in other cities has been a lot more pleasant."},{"title":"Übersicht widget: Time Until","category":"Technical/Web","url":"time-until.html","date":"7 August 2016","tags":"applescript, coffeescript, javascript, time until, übersicht, widgets ","body":"In a previous post I described how I was introduced to CoffeeScript via Übersicht, the desktop widget app for OS X, and eventually published the “Time Since” widget. Seeing a few people download the widget was really satisfying and I was soon wondering what else I could publish. As a pragmatic engineer, it seemed clear to me what the next widget should do: If my first widget calculated the time since an event, the next should calculate the time until an event. I set out to create the companion to “Time Since” and improve upon the original. My first code design choice was to cut out the use of an AppleScript and calculate everything in one CoffeeScript file. It would be more efficient and easier to read. Unfortunately I soon began to realise why the original widget I’d based “Time Since” on had used AppleScript to calculate the time elapsed. Date-Times are fiddly to work with in many languages, and this is true in CoffeeScript too. My code began to look increasingly like spaghetti as I tried to achieve 6 key features: Calculate the number of months and days between two dates (complicated by the varying number of days in each month). Add the option to specify the level of detail in the output text (to the minute, to the hour, to the day, etc.). Remove any 0 amounts from the output (“2 Months and 5 Hours” not “2 Months, 0 Days and 5 Hours”). Add the option to abbreviate times (‘years’ → ‘y’, ‘hours’ → ‘h’, ‘and’ → ‘&’, etc.).
Output a grammatically correct sentence (correct pluralisation and comma-separated terms, with an \u201cand\u201d between the last two\u00a0terms) Prepend and append user-specified text. After a few frustrating hours, it was clear that it would be a lot easier to modify the existing AppleScript rather than reinvent the wheel in JavaScript. I decided to use it instead of using only JavaScript as I knew the AppleScript file could correctly consider the number of days in the months between the 2 dates (feature 1), and it contains a function to pluralise the correct parts of the string (feature\u00a05). The remaining features were added by using a combination of if statements and arrays. The if statements simply ask if an amount is equal to zero. If not, then it\u2019s appended to an array of terms to include in the output. At the end of this code chunk it\u2019s possible to ask how many non-zero terms to include in the output. An array with length one less than the number of output terms is created and filled with commas, with an \u2018and\u2019 in the final position. The two arrays are combined in the correct order by looping through the index of the longer array and appending each term to a final array. The final array is the\u00a0output. \u2018Time Until\u2019 can be downloaded from the \u00dcbersicht widgets gallery. I think another useful feature would be an option to specify the output only in terms of a chosen amount, such as weeks or days. Maybe I\u2019ll do that in the\u00a0future."},{"title":"How to wake up\u00a0early","category":"Non-technical/Journal","url":"how-to-wake-up-early.html","date":"6 August 2016","tags":"productivity, sleep ","body":"For years I\u2019ve wished that I could wake up early and use the quiet pre-breakfast hours for productivity. And for years I have spectacularly failed at this. I love the quiet hours at the end of the day too, as well as waking up slowly. Many famous leaders and politicians are known to start their days early, and I would like to be able to do this too.
After years of trying and failing, I have finally had a breakthrough. The secret of waking up early is\u00a0this: Become a\u00a0parent. After the first couple of months with a\u00a0new-born: You will be well-practised at quickly waking up at previously unimaginable times. You will be used to operating on less sleep than you ever thought possible. Your beautiful child will become a reliable alarm clock, waking you up at the crack of dawn with cute smiles and increasingly insistent demands that you get up and play. If you have the freedom to begin sleeping and working when you want, I still believe this is a great option. I bashed out a PhD in 3.5 years during which I usually woke up late and began to work around lunch time! - Glorious autonomy!\u00a0"},{"title":"Jupyter (iPython) notebooks +\u00a0Pandas","category":"Technical/Developer Tools","url":"Jupyter-ipython-notebooks-pandas.html","date":"3 August 2016","tags":"data, jupyter notebook, pandas, python ","body":"When working with more data than can fit in an Excel file, or when you want to be sure the data won\u2019t be edited, you usually need to interact with the data by writing code. One of the biggest time sinks (for me) when working with these tools (ACL, SQL, Python) is debugging, and working out exactly where in the chain of individual commands something unexpected happened. Even with only a modest page of code, I can quickly find myself rerunning the entire script multiple times and commenting and uncommenting multiple lines in order to understand what\u2019s really going on. If you have a time-consuming task at the start of your script, such as a summarise and sort command, the extra time required can be even greater. This leads to interrupted flow. Pandas is a Python package to manipulate large datasets, and the Jupyter notebook is an application which allows the user to run a Python script in chunks, and output the results of each chunk before continuing. You can re-run a previous chunk without returning to the beginning, and change the code as you go along.
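The chunked workflow can be sketched roughly as follows; this is a minimal illustration, and the DataFrame contents, column names and values are invented for the example rather than taken from the original post:

```python
# Minimal sketch of running a script in chunks, Jupyter-style.
# In a notebook, each "chunk" below would be its own cell.
import pandas as pd

# Chunk 1: load or construct the data. In a notebook this cell runs once,
# even if you re-run the later chunks many times.
df = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "amount": [10, 14, 3, 5],
})

# Chunk 2: the potentially slow "summarise and sort" step.
# Re-running chunk 3 does not repeat this work.
summary = df.groupby("region")["amount"].sum().sort_values(ascending=False)

# Chunk 3: inspect the intermediate result before continuing.
print(summary)
```

Running the summarise step in its own cell is exactly what avoids the rerun-the-whole-script loop described above: you can tweak and re-execute the inspection chunk freely while the expensive chunk's result stays in memory.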
This is amazingly flexible and\u00a0intuitive. I recently worked through an exceptionally good Pandas tutorial recorded at PyCon 2015. \u201cPandas from the ground up\u201d is well structured, clear, has good scope, and the resources are available to download from GitHub. Brandon Rhodes gives you a good working foundation for using Pandas and the Jupyter notebook to manipulate datasets using\u00a0Python."},{"title":"Coursera\u2019s \u201cData Science\u00a0Specialisation\u201d","category":"Technical/Data","url":"courseras-data-science-specialisation.html","date":"1 August 2016","tags":"coursera, data science, r ","body":"Last year I decided to learn the tools required to work as a data scientist. I was confident I already had the mathematical and analytical skills I needed, but I wasn\u2019t familiar with the tools of the\u00a0trade. Some googling brought me to Coursera, and the Data Science Specialisation run by Johns Hopkins University. It consists of 9 courses, and so far I\u2019ve completed five. If you do the courses in order then prior knowledge isn\u2019t required, and I think the courses strike a good balance of brevity and\u00a0depth. The main downside to me is that the courses exclusively use R (which is popular in academia) and I would rather be using Python (which is more popular in\u00a0industry). Each course lasts about 3 weeks and deals with a specific aspect of data science, such as statistical inference or machine learning.
Key concepts and tools in each subject are explained and developed, and whilst it\u2019s not as thorough as a longer course would be, there is more than enough material packed into the lectures, quizzes, assignments and projects to apply to your own work. I\u2019ve read that the second half of the specialisation is a lot more technical than the first, so I\u2019m looking forward to setting aside some time, working through the assignments and acquiring some useful\u00a0skills."},{"title":"\u00dcbersicht widget: Time\u00a0Since","category":"Technical/Web","url":"ubersicht-widget-time-since.html","date":"30 July 2016","tags":"coffeescript, javascript, time since, \u00fcbersicht, widgets, applescript, time until ","body":"\u00dcbersicht is a desktop widgets app for OS X. It\u2019s free, open source, and has a pretty good widgets library to download and play with. A widget is a small app or feature that embeds into the desktop and displays some simple information. It can tell you what song is currently playing, a weather forecast, disk space remaining, etc. The widgets are written in CoffeeScript, which is a variant of\u00a0JavaScript. When I started using \u00dcbersicht I began playing with the widgets, changing their appearance and their position on the screen. Some of the widgets are too complicated to mess with without specific programming knowledge, but others are surprisingly simple and\u00a0intuitive. By trial and error, I began to customize widgets to my preferences. One widget I wanted to have but couldn\u2019t simply download was a timer to tell me exactly how much time had elapsed since a specific past\u00a0event. By combining the display attributes of one widget and the calculation method of another, I was able to mash together a foundation for a new widget.
I then added some extra features: Optional text before and after the elapsed\u00a0time Choice of expanded or abbreviated display\u00a0style Flexible formatting to remove any zero\u00a0amounts The widget is called \u201cTime Since\u201d and is in the \u00dcbersicht widgets gallery."}]