Compare commits: resume-web...rbc-aml (5 commits)

| Author | SHA1 | Date |
|---|---|---|
| | 85017c2ddc | |
| | a79019fc9f | |
| | 69b3c99e6f | |
| | 6673b67cd8 | |
| | 00dd6b77e9 | |

.gitignore (vendored, 2 changes)
```diff
@@ -4,7 +4,7 @@ resources/_gen/
themes/base16*
*.pdf
*t.md
*pt*
commit-msg.txt
.hugo_build.lock
```
```diff
@@ -578,7 +578,7 @@ header {// {{{
  // }
}// }}}
h2 {// {{{
  //color: $base-orange;
  color: $base-orange;
  margin-top: .5rem;
  font-size: 1em;

@@ -639,25 +639,15 @@ header {// {{{
}

.project-header {
  display: flex;
  align-items: baseline;
  justify-content: space-between;
  // margin-bottom: .6em;
  // margin-bottom: .1em;
  margin-bottom: 5px;
}

.project-date {
  margin-left: 1em;
}

.project-title {
  // color: $base-blue;
  color: black;
  display: inline;
  margin-right: 0.5em;
}

.project-title span {
  display: inline;
  margin-left: 0.5em;
  font-weight: normal;
}

.project-link {
@@ -719,10 +709,8 @@ header {// {{{
}

.languages {
  font-weight: normal;
  font-style: normal;
  margin-left: 0.5em;
  color: $base03;
  // font-style: italic;
  // font-size: .9em;
}

.institution {
```
@@ -3,119 +3,128 @@ title: "Resume"
date: 2019-02-11T07:50:51-05:00
draft: false
---

{{% resume/section "Web Dev Projects" %}}<!--- {{{ -->
{{% resume/section projects %}}<!--- {{{ -->

<!--- RBC AML {{{ -->

{{% resume/project name="AML Risk Analytics"
languages="Python, SQL, Tableau"
date="July 2025" show="true" %}}

- Built an end-to-end AML simulation using **Python**, generating **9M+ records** across customers, transactions, and alerts to mimic real-world financial behavior and suspicious activity patterns.
- Wrote advanced **SQL (CTEs + joins)** to classify **high-risk customers**, calculate alert counts, and filter transactions over the past 90 days with aggregated metrics.
- Engineered a **risk scoring model** in Python using transaction thresholds and alert volume to classify customers as Elevated or Critical risk.
- Designed **interactive Tableau dashboards** (Risk Heatmap, Alert Efficiency, Risk vs. Avg Amount) to visualize cross-country AML exposure and alert effectiveness.
- **Developed KPI-ready metrics** (alert rate, avg USD exposure, transaction volume) to drive AML performance reporting and enable cross-country risk comparisons.
- **Normalized multi-currency transaction data** to ensure consistent exposure calculations across USD, CAD, and EUR, supporting reliable AML metric aggregation.

{{% /resume/project %}}
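The 90-day classification query described above can be sketched with CTEs and joins. This is a self-contained SQLite stand-in; every table, column, and threshold here is hypothetical rather than the project's actual schema:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema; the real project's tables/columns may differ.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE transactions (customer_id INTEGER, amount_usd REAL, tx_date TEXT);
CREATE TABLE alerts (customer_id INTEGER, created_at TEXT);
""")
now = datetime(2025, 7, 1)
conn.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                 [(1, 15000, (now - timedelta(days=10)).isoformat()),
                  (1, 22000, (now - timedelta(days=30)).isoformat()),
                  (2, 500,   (now - timedelta(days=5)).isoformat())])
conn.executemany("INSERT INTO alerts VALUES (?, ?)",
                 [(1, now.isoformat()), (1, now.isoformat())])

# CTEs + join: 90-day exposure and alert counts per customer,
# then a threshold-based risk label (thresholds are made up).
query = """
WITH recent_tx AS (
    SELECT customer_id, SUM(amount_usd) AS total_usd
    FROM transactions
    WHERE tx_date >= ?
    GROUP BY customer_id
),
alert_counts AS (
    SELECT customer_id, COUNT(*) AS n_alerts
    FROM alerts GROUP BY customer_id
)
SELECT c.name,
       COALESCE(r.total_usd, 0) AS total_usd,
       COALESCE(a.n_alerts, 0) AS n_alerts,
       CASE
         WHEN COALESCE(a.n_alerts, 0) >= 2
              AND COALESCE(r.total_usd, 0) > 30000 THEN 'Critical'
         WHEN COALESCE(a.n_alerts, 0) >= 1 THEN 'Elevated'
         ELSE 'Standard'
       END AS risk
FROM customers c
LEFT JOIN recent_tx r ON r.customer_id = c.id
LEFT JOIN alert_counts a ON a.customer_id = c.id
"""
cutoff = (now - timedelta(days=90)).isoformat()
rows = {name: risk for name, _, _, risk in conn.execute(query, (cutoff,))}
print(rows)  # Acme: Critical (2 alerts, 37k USD in 90 days); Globex: Standard
```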

<!--- RBC AML }}} -->

<!--- Spotify Visualized {{{ -->

{{% resume/project name="Spotify Visualized"
url="https://github.com/Kevin-Mok/astronofty" languages="Python, Django" date="June 2023"
show="true" %}}

- **Built a high-performance Python backend** using Django and PostgreSQL to process 10K+ data records per user, optimizing ingestion pipelines via API integration and ORM modeling.
- **Engineered normalized database schemas** to streamline query workflows, achieving a **50% reduction in PostgreSQL latency** for high-volume reporting tasks.
- **Visualized user music libraries in Tableau**, creating dashboards that grouped tracks by **artist and genre**, enabling users to explore listening patterns and discover trends in their Spotify data.

{{% /resume/project %}}
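The artist/genre grouping that feeds those dashboards can be sketched as a plain-Python aggregation (the track fields are hypothetical, shaped loosely like a trimmed Spotify API response):

```python
from collections import Counter, defaultdict

# Hypothetical track records; the real pipeline pulls these via the API.
tracks = [
    {"name": "Song A", "artist": "Artist 1", "genre": "rock"},
    {"name": "Song B", "artist": "Artist 1", "genre": "rock"},
    {"name": "Song C", "artist": "Artist 2", "genre": "jazz"},
]

# Group track names under each artist.
by_artist = defaultdict(list)
for t in tracks:
    by_artist[t["artist"]].append(t["name"])

# Count how many tracks fall into each genre.
genre_counts = Counter(t["genre"] for t in tracks)

print(dict(by_artist))  # {'Artist 1': ['Song A', 'Song B'], 'Artist 2': ['Song C']}
print(genre_counts)     # Counter({'rock': 2, 'jazz': 1})
```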

<!--- Spotify Visualized }}} -->

<!--- Rarity Surf {{{ -->

{{% resume/project name="Rarity Surf"
languages="TypeScript, JavaScript, Node.js, React"
date="March 2025" show="true" %}}
languages="Python, Django, JavaScript, React"
date="Oct 2022" show="true" %}}

- **Developed a full-stack web application (TypeScript/JavaScript)** to generate rarity rankings for NFTs, integrating with a **leading marketplace's API** to enable users to quickly identify rare NFTs and check their listing status, **improving market research efficiency by 80%**.
- **Built a scalable [Node.js backend](https://github.com/Rarity-Surf/ME-sniper-backend)** with REST API endpoints to return NFTs based on customizable filters such as max rank, price, and rarest traits. **Optimized performance** to handle **3,000+ concurrent requests** by implementing efficient data fetching and caching mechanisms using **PostgreSQL**, ensuring low-latency access to NFT data.
- **Built a dynamic [React frontend](https://github.com/Rarity-Surf/ME-sniper-frontend) (TypeScript/JavaScript)** to load and display NFTs in real time with user-defined filters. Styled using a mobile-responsive library, **reducing load times by 50%**.
- **Developed a [Discord bot](https://github.com/Rarity-Surf/ME-sniper-discord-bot) (TypeScript/JavaScript/Node.js)** to notify users of profitable resale opportunities by leveraging historical sales data to assess deal quality. This feature **increased user engagement by 80%** and provided a seamless way for users to stay updated on market opportunities.
- **Built a full-stack reporting tool** using React, Django, and **PostgreSQL** to analyze structured/unstructured metadata from APIs, enabling real-time rarity scoring and improving insight delivery by **80%**.
- **Optimized SQL query performance** within a Django-based pipeline, processing NFT ranking data at scale and exposing results via GraphQL with **low-latency response times under high concurrency (2,000+ queries)**.

{{% /resume/project %}}
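Statistical rarity scoring of the kind described above is commonly computed as a sum of inverse trait frequencies. A minimal sketch under that assumption (the collection data and formula details are illustrative, not the project's exact algorithm):

```python
from collections import Counter

# Hypothetical collection: each NFT is a dict of trait -> value.
nfts = {
    "nft1": {"background": "blue", "hat": "crown"},
    "nft2": {"background": "blue", "hat": "cap"},
    "nft3": {"background": "blue", "hat": "cap"},
}

# Frequency of each (trait, value) pair across the collection.
freq = Counter(pair for traits in nfts.values() for pair in traits.items())
total = len(nfts)

def rarity_score(traits):
    # Statistical rarity: rarer trait values contribute larger terms.
    return sum(1 / (freq[pair] / total) for pair in traits.items())

scores = {nft_id: rarity_score(t) for nft_id, t in nfts.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # rarest first; nft1's unique crown makes it top-ranked
```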

<!--- Rarity Surf }}} -->

<!--- {{{ Kanban -->

{{% resume/project name="Kanban Calendar"
url="https://github.com/Kevin-Mok/astronofty"
languages="TypeScript, JavaScript, React, Next.js" date="Mar 2024"
show="true" %}}

- **Developed a [responsive calendar Kanban board](https://kanban-calendar-lake.vercel.app/) using Next.js, TypeScript, and Tailwind CSS**, featuring draggable events, smooth card-to-detail transitions, and week/day views optimized for both desktop and mobile.
- **Engineered intuitive navigation and cross-device interactivity**, implementing swipe gestures, infinite horizontal scrolling (mobile), and arrow controls (desktop) while resolving challenges like drag-and-drop consistency and responsive layout transitions.

{{% /resume/project %}}

<!--- }}} Kanban -->

<!--- Astronofty {{{ -->

{{% resume/project name="Astronofty"
url="https://github.com/Kevin-Mok/astronofty"
languages="JavaScript, React, Solidity" date="Jan 2023"
show="true" %}}

- **Secured [2nd place](https://devpost.com/software/astronofty) overall out of 150+ teams** at UofTHacks X, a 36-hour hackathon, for developing a blockchain-based NFT marketplace app.
- **Built and optimized React (JavaScript) [components](https://github.com/Kevin-Mok/astronofty/tree/main/src/components)** to synchronously upload images and metadata to IPFS, **enhancing user engagement by 80%** during the demo.

{{% /resume/project %}}

<!--- Astronofty }}} -->

{{% /resume/section %}}<!--- }}} -->

{{% resume/section "Work Experience" %}}<!--- {{{ -->

{{% resume/work-experience name="Red Hat"
title="Cloud/Software Engineer Intern" languages="Kubernetes, GoLang, Jenkins" date="May 2022 — Aug 2023" %}}
title="Cloud/Software Engineer Intern" languages="Kubernetes, GoLang, Jenkins" date="May 2022 - Aug 2023" %}}

- **Eliminated 80% of manual configuration errors** by enabling the Kubernetes operator to automatically fetch data from deployed services and update configurations, **deprecating legacy startup scripts and reducing overall startup time by 40%** (**Kubernetes/GoLang** used for this and the three bullets below).
- **Reduced deployment time by 66%** by implementing a [solution](https://github.com/apache/incubator-kie-kogito-operator/commit/175a6356c5474f2360ccb8ae835e0b9b2d653cf1) for deploying locally-compiled binaries onto Kubernetes/OpenShift via the command line, **cutting average deployment times from 45 minutes to 15 minutes**.
- **Improved application stability** by introducing startup probes for legacy applications with longer boot times, **resulting in a 50% reduction in startup-related failures and downtime during production launches**.
- **Improved system reliability** by refactoring probes to dynamically assign default values based on YAML files, **increasing probe accuracy by 30%** and preventing misconfigurations.
- **Increased CI pipeline efficiency** by rewriting the **Jenkins (Groovy)** [nightly pipeline](https://github.com/apache/incubator-kie-kogito-pipelines/commit/4c83f1aecdea2c1ba2796b79839a90d4083dce88) to run in a GitHub PR environment, allowing for automated testing of all team-submitted PRs prior to merging, **reducing manual intervention by 60%**.

{{% /resume/section %}}<!--- }}} -->

- **Decreased manual configuration errors by 80%** by automating service discovery and dynamic config updates, aligning with AML goals of minimizing operational risk and improving data integrity.
- **Enhanced CI pipeline reproducibility and performance** by rewriting the Jenkins nightly pipeline to support automated PR-level testing with reusable parameters, improving report consistency across environments.
- **Collaborated cross-functionally** with developers and testers to maintain reliable infrastructure, echoing the AML role's emphasis on stakeholder partnership for building robust reporting systems.
- **Improved system reliability** during production launches by implementing startup probes for legacy services, reducing downtime and enhancing stability for automated monitoring/reporting pipelines.
- **Reduced reporting deployment time by 66%** by building a CLI-based solution to push compiled binaries directly into Kubernetes/OpenShift clusters, accelerating turnaround for testing and data validation workflows.

{{% /resume/section %}}<!--- }}} -->
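A startup probe of the kind mentioned above generally looks like the following Kubernetes snippet; the image name, health endpoint, port, and timing values here are illustrative assumptions, not the actual Red Hat configuration:

```yaml
# Illustrative startupProbe for a slow-booting legacy container.
# Kubernetes defers liveness/readiness checks until this probe passes,
# so long boot times no longer register as failures.
containers:
  - name: legacy-app
    image: example/legacy-app:latest   # hypothetical image
    startupProbe:
      httpGet:
        path: /healthz                 # hypothetical health endpoint
        port: 8080
      failureThreshold: 30             # allow up to 30 * 10s = 5 min to boot
      periodSeconds: 10
```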

{{% resume/section skills %}}<!--- {{{ -->

**TypeScript**, **JavaScript**, **React**, **Node.js**, **Python**, **Django**, PostgreSQL, MongoDB, Bash, **Git**, **Linux**, **Command Line**, Go(Lang), AWS, Kubernetes, Terraform, Docker (Compose), Jenkins, Groovy, Solidity, C
**Python**, **SQL**, **PostgreSQL**, **Tableau**, **MongoDB**, **JavaScript**, Django, **React**, Bash, **Git**, **Linux**, **Command Line**, Go(Lang), AWS, Kubernetes, Terraform, Docker (Compose), Jenkins, Groovy, Solidity, C

{{% /resume/section %}}<!--- }}} -->

{{% resume/section education %}}<!--- {{{ -->

{{% resume/education name="University of Toronto (St. George)"
title="Computer Science Specialist — 3.84 GPA (CS). Graduated with High Distinction." date="2019 — 2024" %}}
title="Computer Science Specialist - 3.84 GPA (CS). Graduated with High Distinction." date="2019 - 2024" %}}

{{% /resume/section %}}<!--- }}} -->

@@ -1,39 +0,0 @@
# ME Sniper

write me a resume section similar to this (just a bit longer) for a web dev resume based on the points after, with made-up statistics

## Old

- **Developed a full-stack web application** to generate rarity rankings for NFTs integrated with the leading NFT marketplace's (OpenSea) API, enabling users to **quickly identify rare NFTs** and check their listing status, **improving market research efficiency by 80%**.
- **Architected a robust Django (Python) [backend](https://github.com/Kevin-Mok/rarity-surf)** to fetch and process NFT metadata from IPFS, store rarity rankings in **PostgreSQL**, and serve the data via a GraphQL API, **ensuring low-latency access and scaling to handle 2,000+ concurrent requests**.
- **Developed a dynamic React (JavaScript) [frontend](https://github.com/Kevin-Mok/rarity-surf-frontend)** using hooks to load rarity data in real time, styled with Tailwind for mobile responsiveness, **improving user experience and reducing frontend load times by 70%**.

## New

- Developed a full-stack web application to generate rarity rankings for NFTs integrated with the leading NFT marketplace's (Magic Eden) API, enabling users to quickly identify rare NFTs and check their listing status, improving market research efficiency by 80%.
- fetch metadata from either IPFS or the website in parallel processes to create rarity rankings as soon as metadata is revealed
- reverse-engineered the rarity-ranking algorithm for NFTs based on an article from the marketplace about their in-house statistical rarity ranking
- created a Prisma schema for PostgreSQL to store NFT data
- Node.js backend with API endpoints to return NFTs based on max rank/price along with rarest traits
- lowest prices for a rarity percentile to see if a listing is a good deal
- fetch all listings from the leading marketplace (Magic Eden) to identify which rare NFTs are on sale and filter based on max price
- store previous sales data to check whether a buy at a rarity percentile is a good deal
- React FE to dynamically load NFTs based on rarity rank/price filter with ability to hide seen ones
- Discord bot to notify you when a customizable profitable resale opportunity comes up based on rarity level/price

@@ -52,41 +52,6 @@ date="Oct 2021" show="true" %}}

<!--- Rarity Surf }}} -->

<!--- Rarity Surf {{{ -->

{{% resume/project name="Rarity Surf (2)"
languages="TypeScript, Node.js, React"
date="" show="true" %}}

- **Developed a full-stack web application** to generate rarity rankings for NFTs, integrating with the **leading marketplace's API** to enable users to quickly identify rare NFTs and check their listing status, **improving market research efficiency by 80%**.
- **Built a scalable Node.js backend** with REST API endpoints to return NFTs based on customizable filters such as max rank, price, and rarest traits. **Optimized performance** to handle **3,000+ concurrent requests** by implementing efficient data fetching and caching mechanisms, ensuring low-latency access to NFT data.
- **Developed a dynamic React frontend** to load and display NFTs in real time based on user-defined filters to streamline browsing. Styled the interface using **Tailwind CSS** for a responsive and modern design, **reducing frontend load times by 50%**.
- **Developed a Discord bot** to notify users of profitable resale opportunities by leveraging historical sales data to assess deal quality. This feature **increased user engagement by 80%** and provided a seamless way for users to stay updated on market opportunities.
- Designed and implemented a **PostgreSQL schema** to efficiently store NFT data, including metadata, rarity scores, and historical sales data.

{{% /resume/project %}}
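The deal-quality check the Discord bot performs can be sketched as comparing a listing against historical sales in a similar rarity band; the sample prices, discount threshold, and median heuristic here are assumptions, not the project's actual logic:

```python
# Hypothetical historical sales (in SOL) for NFTs in a similar rarity band.
historical_sales = [2.0, 2.4, 2.5, 3.0, 3.1, 4.0]

def is_good_deal(listing_price, sales, discount=0.8):
    """Flag a listing priced at or below `discount` * median of comparable sales."""
    ordered = sorted(sales)
    mid = len(ordered) // 2
    median = (ordered[mid] if len(ordered) % 2
              else (ordered[mid - 1] + ordered[mid]) / 2)
    return listing_price <= discount * median

print(is_good_deal(2.0, historical_sales))  # True: well below the 2.75 median
print(is_good_deal(2.6, historical_sales))  # False: above the 0.8 * median cutoff
```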

<!--- Rarity Surf }}} -->

<!--- Astronofty {{{ -->

{{% resume/project name="Astronofty"

@@ -177,6 +142,7 @@ url="https://kevin-mok.com/server/" languages="AWS, Kubernetes, Docker, Terrafor

<!--- AWS 3 }}} -->

<!--- Astronofty (extended) {{{ -->

{{% resume/project name="Astronofty"

@@ -2,8 +2,9 @@

<div class="row project-header">
  <div class="col-8 text-left">
    <h2 class="project-title">
      {{ .Get "name" }} <span class="languages"><{{ .Get "languages" }}></span>
      {{ .Get "name" }}
    </h2>
    <span><{{ .Get "languages" }}></span>
  </div>
  <div class="col-4 text-right date">{{ .Get "date" }}</div>
</div>