
Dear capital allocators, dividend hunters and yield rockets!


It is my pleasure to present to you today Kapriolen ☀ Capital (https://kapriolen.capital) 🚀


The site reflects the current status of @Gaylord's 100k Depot Challenge. The rules and participants can be found here: https://app.getquin.com/activity/zyvcPsclUx


𝗕𝘂𝘁 𝘄𝗵𝘆?

After seeing the painstaking manual listing (https://app.getquin.com/activity/pOTYXISoTp) by @getquin, I couldn't help myself and had to automate this endeavor, just as I would in my day-to-day work. And to preface it: no, Oli, Bitcoin was not considered as a solution here 🤡


Building this toy took me about 3-4 hours, and the knowledge return I got from it was worth every minute. I spared costs, but not effort, because the savings rate should always be kept in mind. But before I get into the costs, a quick overview of how the whole thing works technically in the first place.



𝗗𝗮𝘁𝗮 𝗖𝗼𝗹𝗹𝗲𝗰𝘁𝗶𝗼𝗻 𝗮𝗻𝗱 𝗔𝗴𝗴𝗿𝗲𝗴𝗮𝘁𝗶𝗼𝗻

The very first step, of course, is to collect the data from getquin. This is done with a headless browser [1] that drives a Chrome instance to visit the dashboards and collect specific data points. I have creatively named this component Scrappy. Scrappy, of course, needs to know who to drop in on; for this there is a database that holds all participants together with the dashboard link of their respective challenge portfolio. A cronjob [8] runs Scrappy every hour from 8 a.m. to 11 p.m., and a run takes about 5 minutes.

Once Scrappy has collected the data, it is stored in the database in a structured way so that it can later be retrieved by the website. The website's cache is then cleared via an API call and refilled with the updated data. This lets the page load very fast and avoids roundtrips to the database, which would be unnecessary anyway since the data only changes once an hour.
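To give an idea of what "stored in a structured way" might involve, here is a minimal sketch of two helpers Scrappy could use. Both the German number format of the scraped values and the key scheme are my assumptions for illustration, not the actual implementation:

```typescript
// Hypothetical helper: normalize a scraped portfolio value such as
// "104.237,51 €" (assuming German number formatting on the dashboards)
// into a plain number before it goes into the database.
function parsePortfolioValue(raw: string): number {
  const cleaned = raw
    .replace(/[^\d.,-]/g, "") // strip currency symbol and whitespace
    .replace(/\./g, "") // drop thousands separators
    .replace(",", "."); // decimal comma -> decimal point
  const value = Number(cleaned);
  if (Number.isNaN(value)) {
    throw new Error(`Could not parse portfolio value: "${raw}"`);
  }
  return value;
}

// Hypothetical key scheme for the structured storage:
// one entry per participant and hourly scrape run.
function storageKey(participant: string, runIso: string): string {
  return `challenge:${participant}:${runIso}`;
}
```

With a scheme like this, the website only ever reads prepared numbers and never has to touch the raw scraped text.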


𝗪𝗲𝗯𝘀𝗶𝘁𝗲

I did as any reasonable developer would and made use of existing building blocks. The interface was built with React [2] (Meta Platforms), Next.js [3] (Vercel), the Ant Design design system [4] (Alibaba), and TypeScript [5] (Microsoft). I customized the components to my visual and structural needs and voilà, the site was up.


𝗜𝗻𝗳𝗿𝗮𝘀𝘁𝗿𝘂𝗰𝘁𝘂𝗿𝗲

The infrastructure consists of the website (Vercel [6]), the database (Redis [7]) and the scraper [1]. In addition, there is a deployment pipeline that is triggered via GitHub (Microsoft).

A deployment pipeline is nothing more than a sequence of steps that ensures and executes the smooth delivery of the software.
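As an illustration, a minimal pipeline of this kind could look like the following GitHub Actions workflow — a sketch under my own assumptions (script names included), not the project's actual configuration:

```yaml
# Hypothetical workflow: on every push to main, install dependencies
# and run the checks; Vercel then picks up the deployment itself.
name: deploy
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - run: npm ci
      - run: npm run lint
      - run: npm run build
```

In practice, Vercel can also deploy straight from its Git integration, in which case a workflow like this mainly guards the merge with lint and build checks.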


𝗖𝗼𝘀𝘁𝘀

Of course, since I am using my tech companies as a model, I am also operating unprofitably. The aforementioned Scrappy instance costs me about $5/month at the hosting shop DigitalOcean, where I am also invested. I see it as a contribution to boosting their next earnings 🤡. The domain cost me €10 and was purchased at GoDaddy (🥵). The website runs on Vercel's free plan. For the database I also pay nothing at the provider Upstash. If anyone tries to pull capers with the website, I will additionally bring Cloudflare on board, which offers a free DDoS protection plan. Cloudflare is used, for example, by Allianz, DoorDash and Shopify.


𝗔𝗻𝘆 𝗺𝗼𝗿𝗲 𝘂𝗽𝗱𝗮𝘁𝗲𝘀 𝗰𝗼𝗺𝗶𝗻𝗴?

Yep, and you can bet your beloved dividends and boomer stocks on it.


There will be the following updates:

- Rules overview

- Statistics, such as top 5 holdings of all participants, largest single position loss/gain, dividend king

- Display of positions

- Real-time updates (maybe, that one is tricky)


So that's it from your KapitalKapriole


If you have any suggestions, feature requests or the like, let me know.


𝗙𝗲𝗲𝗱𝗯𝗮𝗰𝗸

🚀 = Please, more nerdy tech posts of this nature.

🆘 = I don't know, is this really necessary?

🦍 = CapitalKellerKinder together stronk

❤ = I don't know what you're babbling about, but it was quite nice :)


Now all the participants get tagged again, because without attention I wither like an unwatered plant 🌚


@getquin
@leveragegrinding
@hendrik_lmr
@Gaylord
@MontgomeryBurns
@Gally
@Pfeffiboy
@wormii
@DonkeyInvestor
@Lorena
@DankeTraderepublic
@Vic1
@TomTurboInvest_100k
@TheDividendCollector
@TheAccountant89
@TheRealDeepFuckingValue
@meta
@Landwirtin


References

[1] Puppeteer - https://github.com/puppeteer/puppeteer

[2] React - https://reactjs.org/

[3] Next.js - https://nextjs.org/

[4] Ant Design - https://ant.design/

[5] TypeScript - https://www.typescriptlang.org/

[6] Vercel - https://vercel.com/home

[7] Redis - https://equityzen.com/company/redislabs/ (Redis is a private company backed by Dell's VC fund and Goldman Sachs. Every tweet you see is cached by a Redis server, and the technology is generally well received in the industry.)

[8] Cron - https://www.digitalocean.com/community/tutorials/how-to-use-cron-to-automate-tasks-ubuntu-1804-de#:~:text=Cron%20ist%20ein,von%20wartungsbezogenen%20Aufgaben.
