APIs & Automation
Simplifying Complexity with Technology

Computers communicate with one another through Application Programming Interfaces, or APIs. APIs provide a standardized way for programs to send and receive data, forming the backbone of modern internet functionality. Most websites rely on APIs to render content dynamically.

Data vendors frequently offer their information through APIs, enabling developers to automate tasks that depend on real-time or historical data.
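As a minimal illustration of the request/response pattern described above, the sketch below builds a query URL and fetches JSON from a REST endpoint using only the Python standard library. The endpoint and parameter names are hypothetical, not any specific vendor's API.

```python
import json
import urllib.parse
import urllib.request

def build_url(base, params):
    """Append URL-encoded query parameters to a base endpoint."""
    return base + "?" + urllib.parse.urlencode(params)

def fetch_json(url, timeout=10):
    """GET the URL and decode the JSON response body."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

# Hypothetical usage:
# quotes = fetch_json(build_url("https://api.example.com/v1/quotes",
#                               {"symbol": "SPY", "interval": "1d"}))
```

Most real integrations add authentication headers and retry logic on top of this basic shape.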

Use Cases

Public-Facing APIs

To support the WEBs Investment platform that Tony created, he designed public-facing APIs capable of handling high-throughput requests. Leveraging cloud technologies such as managed NoSQL databases, Tony built low-latency APIs that power the website and serve critical data to Wall Street banks, counterparties, and trading firms.

These APIs deliver real-time information about the ETFs launched by WEBs. Both the latest snapshot and the full historical record are accessible, giving end users flexibility in how they consume the data.
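The "latest vs. historical" lookups described above can be sketched against a key-value (NoSQL-style) store keyed by ISO date. The records and field names here are illustrative, not the actual WEBs schema.

```python
# Illustrative store: ISO date -> record. ISO dates sort
# lexicographically, so max() and sorted() give chronological order.
records = {
    "2024-01-02": {"nav": 25.10},
    "2024-01-03": {"nav": 25.32},
    "2024-01-04": {"nav": 25.05},
}

def latest(store):
    """Return the most recent record in the store."""
    key = max(store)
    return {"date": key, **store[key]}

def history(store, start, end):
    """Return all records with start <= date <= end, oldest first."""
    return [{"date": d, **store[d]} for d in sorted(store) if start <= d <= end]
```

In a production NoSQL database these would typically map to a point read on the newest key and a range query over a date-sorted partition.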

Obtaining COVID Vaccines

During the early rollout of COVID vaccines, scheduling an appointment was challenging due to unpredictable availability and rapid booking.

Tony built an application that integrated with APIs from major providers like CVS, Walgreens, and grocery store chains. The app continuously monitored available appointments within a user-defined zip code and sent email notifications with booking links. Users received updates every minute until they secured an appointment. This tool was particularly helpful for elderly and at-risk individuals, ensuring hundreds of family, friends, and coworkers could get vaccinated sooner.
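The core of such a monitoring loop is deciding which openings are new since the last check, so users aren't re-notified every minute about the same slot. The sketch below assumes a hypothetical provider payload shape; the actual CVS and Walgreens responses differed and required per-provider parsing.

```python
def new_openings(payload, zip_code, already_seen):
    """Return sites in the target zip code with open slots that
    haven't triggered a notification yet. `already_seen` is mutated
    to remember which (provider, zip) pairs were reported."""
    openings = []
    for site in payload.get("sites", []):
        if site.get("zip") == zip_code and site.get("slots", 0) > 0:
            key = (site["provider"], site["zip"])
            if key not in already_seen:
                already_seen.add(key)
                openings.append(site)
    return openings
```

A scheduler would call this every 60 seconds per provider and email any returned sites (e.g. via `smtplib`) with their booking links.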

Trading APIs

To explore Artificial Intelligence (AI) and Machine Learning (ML) in stock market trading, Tony automated the execution of trades. Using trading APIs offered by retail brokerage firms, he developed open-source software for the ETrade and TD Ameritrade platforms (both platforms have since been retired following acquisitions).

Each morning, Tony deployed an AWS EC2 server to execute trades based on his AI/ML models. He also built a performance-tracking dashboard comparing results to custom benchmarks and market indices. While the trading performance was mixed, the automation worked flawlessly.

Data Gathering

Valuable data is scattered across the internet, often provided via APIs or displayed only temporarily on websites. To capture this ephemeral data and support his blog, Tony developed automated systems to gather data from various sources.

Using open APIs such as those from the U.S. Treasury and the St. Louis Federal Reserve (FRED), Tony collected critical datasets. For data not offered via APIs, he implemented periodic web scraping, storing the results for future use. This automated pipeline ran nightly, with failure reports ensuring reliability, and provided the necessary data for his dashboards and articles.
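The nightly run described above amounts to executing a set of fetch tasks, storing whatever succeeds, and collecting failures for the report rather than letting one bad source abort the whole run. A minimal sketch of that pattern, with hypothetical task names:

```python
def run_pipeline(tasks):
    """Run each named zero-argument task; return (results, failures).
    A failing task is recorded by name with its exception instead of
    stopping the remaining tasks."""
    results, failures = {}, {}
    for name, task in tasks.items():
        try:
            results[name] = task()
        except Exception as exc:
            failures[name] = repr(exc)
    return results, failures

# Hypothetical usage:
# results, failures = run_pipeline({
#     "treasury_yields": fetch_treasury_yields,   # API call
#     "fred_cpi": fetch_fred_cpi,                 # API call
#     "site_scrape": scrape_temporary_table,      # web scrape
# })
# if failures: send_failure_report(failures)
```

A cron entry (or a scheduled Lambda) invoking this each night, followed by an email of the `failures` dict when non-empty, reproduces the reliability reporting the pipeline relied on.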