Compare commits: resume-cal ... rbc-aml
5 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | 85017c2ddc |  |
|  | a79019fc9f |  |
|  | 69b3c99e6f |  |
|  | 6673b67cd8 |  |
|  | 00dd6b77e9 |  |
.gitignore (vendored): 2 changes
@@ -4,7 +4,7 @@ resources/_gen/
 themes/base16*
 
 *.pdf
-*p*.md
+*pt*
 
 commit-msg.txt
 .hugo_build.lock
(stylesheet file, name not shown in this view)

@@ -104,7 +104,7 @@ body {
   background-color: $background-color;
   color: $color;
   // line-height: 1.5;
-  line-height: 1.59;
+  line-height: 1.57;
   // font-size: 100%;
   // font-size: 15px;
   font-size: 17px;
@@ -578,7 +578,7 @@ header {// {{{
   // }
 }// }}}
 h2 {// {{{
-  //color: $base-orange;
+  color: $base-orange;
   margin-top: .5rem;
   font-size: 1em;
 
@@ -639,25 +639,15 @@ header {// {{{
 }
 
 .project-header {
-  display: flex;
-  align-items: baseline;
-  justify-content: space-between;
+  // margin-bottom: .6em;
+  // margin-bottom: .1em;
   margin-bottom: 5px;
 }
 
-.project-date {
-  margin-left: 1em;
-}
-
 .project-title {
+  // color: $base-blue;
+  color: black;
   display: inline;
-  margin-right: 0.5em;
-}
-
-.project-title span {
-  display: inline;
-  margin-left: 0.5em;
-  font-weight: normal;
 }
 
 .project-link {
@@ -719,10 +709,8 @@ header {// {{{
 }
 
 .languages {
-  font-weight: normal;
-  font-style: normal;
-  margin-left: 0.5em;
-  color: $base03;
+  // font-style: italic;
+  // font-size: .9em;
 }
 
 .institution {
@@ -752,8 +740,6 @@ header {// {{{
 
 &.letter {
   margin-top: 2em;
-  margin-left: 2em;
-  margin-right: 2em;
   line-height: 1.5em;
 
   img {
@@ -768,8 +754,6 @@ header {// {{{
 
 p {
   margin-bottom: 1em;
-  font-size: 25px;
-  line-height: 1.5em;
 }
 
 .no-line-spacing {
@@ -1196,8 +1180,3 @@ pre { background: #2d2d2d; color: #f2f0ec }
 
 // }}} Pygments //
 
-@media print {
-  .resume, body { font-size: 13pt; line-height: 1.5; }
-  .resume li { margin-bottom: 2pt !important; }
-  .resume h2, .resume .section__title { margin: 6pt 0 2pt !important; }
-}
(resume content file, name not shown in this view)

@@ -3,59 +3,129 @@ title: "Resume"
 date: 2019-02-11T07:50:51-05:00
 draft: false
 ---
-{{% resume/section "Summary" %}}
-Customer-focused call centre professional with Tier 1/2 support experience, de-escalation, and clear communication. Improves first-response, reduces escalations, and shortens resolution times across high-volume phone/chat/email queues. Strong documentation habits and plain-language explanations for non-technical users.
-{{% /resume/section %}}
-
-{{% resume/section "Work Experience" %}}
-
-{{% resume/work-experience
-name="Digital Goods Marketplace"
-title="Owner–Operator (Customer Support & Sales)"
-languages="Live Chat Support, Dispute Resolution, Sales Negotiation"
-date="July 2025 — Present"
-%}}
-
-- Built and managed a **peer-to-peer e-commerce operation** reselling digital items; exceeded **$50,000+ gross merchandise value**.
-- Closed transactions and **middlemanned high-value trades (deals exceeding $5,000)** with **250+ verified vouches**, maintaining **5-star satisfaction** and **zero unresolved disputes**.
-- Handled **end-to-end operations and escrow**: sourcing, pricing, inventory, listings, secure payments and fulfillment.
-- Implemented **fair-value pricing** and **bundle offers** to accelerate turnover and improve margins while reducing low-value inquiries.
-- Standardized **ownership verification and middleman workflows** to mitigate **fraud/chargeback** risk on large trades.
-- Tracked **P&L and cash flow**; reconciled payments and maintained records for auditability.
-
-{{% resume/work-experience
-name="Red Hat"
-title="Technical Support Engineer Intern (Tier 1/2)"
-languages="Ticketing/Triage, De-escalation, Knowledge Base Writing"
-date="Aug 2022 — Aug 2024"
-%}}
-
-- Delivered **Tier 1/2 frontline support** for CI/CD and Kubernetes issues via a ticket queue, improving **first-response time by 40%** through better triage and routing.
-- Performed **incident troubleshooting and root-cause analysis**; automated data capture/validation that resolved **80% of config errors** and **reduced downtime by 40%**.
-- Wrote **clear, step-by-step knowledge-base articles** and troubleshooting flows that enabled Tier 1 to solve common probe issues, **cutting escalations by 30%**.
-- Built a deployment **runbook** that standardized fixes and **reduced repeat contacts/tickets by 66%**; **shortened resolution time from 45 → 15 minutes**.
-- Kept users informed with **concise status updates**, set expectations, and **de-escalated frustrated stakeholders** by focusing on next steps and time to resolution.
-- Partnered with QA/DevOps to capture **root causes** of startup failures; implemented dynamic probes that **cut production launch issues by 50%**.
-
-{{% /resume/section %}}
-
-{{% resume/section "Web Dev Projects" %}}
-{{% resume/project name="Rarity Surf" languages="User Support, Bug Reproduction" date="March 2024 — Dec 2024" show="true" %}}
-- Reproduced user-reported issues; wrote **concise repro steps** and a **known-issues + workarounds** note to reduce repeat questions.
-- Partnered with devs to **prioritize fixes** from impact-driven triage and shipped **onboarding/troubleshooting snippets** that cut new-user setup pings, improved first-contact resolution, and kept user-facing notes up to date.
-{{% /resume/project %}}
-{{% /resume/section %}}
-
-{{% resume/section "Skills" %}}
-- **Customer Support & Call Centre:** Active listening, empathy, de-escalation, clear written/verbal comms, ticket triage/prioritization, SLA awareness, call/chat/email etiquette, documentation & KB writing
-- **Technical:** Microsoft 365, VPN/log basics, Linux basics
-- **Languages:** English; **Cantonese (fluent)**
-{{% /resume/section %}}
-
-{{% resume/section "Education" %}}
-{{% resume/education name="University of Toronto (St. George)"
-title="Computer Science Specialist — 3.84 GPA. Graduated with High Distinction."
-date="2020 — 2025" %}}
-{{% /resume/section %}}
+{{% resume/section projects %}}<!--- {{{ -->
+
+<!--- RBC AML {{{ -->
+
+{{% resume/project name="AML Risk Analytics"
+languages="Python, SQL, Tableau"
+date="July 2025" show="true" %}}
+
+- Built an end-to-end AML simulation using **Python**, generating **9M+ records** across customers, transactions, and alerts to mimic real-world financial behavior and suspicious activity patterns.
+- Wrote advanced **SQL (CTEs + joins)** to classify **high-risk customers**, calculate alert counts, and filter transactions over the past 90 days with aggregated metrics.
+- Engineered a **risk scoring model** in Python using transaction thresholds and alert volume to classify customers as Elevated or Critical risk.
+- Designed **interactive Tableau dashboards** (Risk Heatmap, Alert Efficiency, Risk vs. Avg Amount) to visualize cross-country AML exposure and alert effectiveness.
+- **Developed KPI-ready metrics** (alert rate, avg USD exposure, transaction volume) to drive AML performance reporting and enable cross-country risk comparisons.
+- **Normalized multi-currency transaction data** to ensure consistent exposure calculations across USD, CAD, and EUR, supporting reliable AML metric aggregation.
+
+{{% /resume/project %}}
+
+<!--- RBC AML }}} -->
+
+<!--- Spotify Visualized {{{ -->
+
+{{% resume/project name="Spotify Visualized"
+url="https://github.com/Kevin-Mok/astronofty" languages="Python, Django" date="June 2023"
+show="true" %}}
+
+- **Built a high-performance Python backend** using Django and PostgreSQL to process 10K+ data records per user, optimizing ingestion pipelines via API integration and ORM modeling.
+- **Engineered normalized database schemas** to streamline query workflows, achieving a **50% reduction in PostgreSQL latency** for high-volume reporting tasks.
+- **Visualized user music libraries in Tableau**, creating dashboards that grouped tracks by **artist and genre**, enabling users to explore listening patterns and discover trends in their Spotify data.
+
+{{% /resume/project %}}
+
+<!--- Spotify Visualized }}} -->
+
+<!--- Rarity Surf {{{ -->
+
+{{% resume/project name="Rarity Surf"
+languages="Python, Django, JavaScript, React"
+date="Oct 2022" show="true" %}}
+
+- **Built a full-stack reporting tool** using React, Django, and **PostgreSQL** to analyze structured/unstructured metadata from APIs, enabling real-time rarity scoring and improving insight delivery by **80%**.
+- **Optimized SQL query performance** within a Django-based pipeline, processing NFT ranking data at scale and exposing results via GraphQL with **low-latency response times under high concurrency (2,000+ queries)**.
+
+{{% /resume/project %}}
+
+<!--- Rarity Surf }}} -->
+
+{{% /resume/section %}}<!--- }}} -->
+
+{{% resume/section "Work Experience" %}}<!--- {{{ -->
+
+{{% resume/work-experience name="Red Hat"
+title="Cloud/Software Engineer Intern" languages="Kubernetes, GoLang, Jenkins" date="May 2022 - Aug 2023" %}}
+
+- **Decreased manual configuration errors by 80%** by automating service discovery and dynamic config updates, aligning with AML goals of minimizing operational risk and improving data integrity.
+- **Enhanced CI pipeline reproducibility and performance** by rewriting the Jenkins nightly pipeline to support automated PR-level testing with reusable parameters, improving report consistency across environments.
+- **Collaborated cross-functionally** with developers and testers to maintain reliable infrastructure, echoing the AML role's emphasis on stakeholder partnership for building robust reporting systems.
+- **Improved system reliability** during production launches by implementing startup probes for legacy services, reducing downtime and enhancing stability for automated monitoring/reporting pipelines.
+- **Reduced reporting deployment time by 66%** by building a CLI-based solution to push compiled binaries directly into Kubernetes/OpenShift clusters, accelerating turnaround for testing and data validation workflows.
+
+{{% /resume/section %}}<!--- }}} -->
+
+{{% resume/section skills %}}<!--- {{{ -->
+
+**Python**, **SQL**, **PostgreSQL**, **Tableau**, **MongoDB**, **JavaScript**, Django, **React**, Bash, **Git**, **Linux**, **Command Line**, Go(Lang), AWS, Kubernetes, Terraform, Docker (Compose), Jenkins, Groovy, Solidity, C
+
+{{% /resume/section %}}<!--- }}} -->
+
+{{% resume/section education %}}<!--- {{{ -->
+
+{{% resume/education name="University of Toronto (St. George)"
+title="Computer Science Specialist - 3.84 GPA (CS). Graduated with High Distinction." date="2019 - 2024" %}}
+{{% /resume/section %}}<!--- }}} -->
+
+<!-- vim: fdm=marker -->
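For context on how the shortcode calls above are consumed: the attributes on each `{{% resume/project %}}` / `{{% resume/work-experience %}}` tag are read by the Hugo shortcode templates changed near the end of this compare via `.Get`. A minimal sketch with made-up values (not taken from the commit):

    {{% resume/project name="Example Project" languages="Python, SQL" date="July 2025" show="true" %}}
    - One bullet of project detail, rendered as the shortcode's inner content.
    {{% /resume/project %}}

Inside the template, `{{ .Get "name" }}`, `{{ .Get "languages" }}`, and `{{ .Get "date" }}` return those attribute values.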
(deleted notes file, name not shown in this view)

@@ -1,39 +0,0 @@
-# ME Sniper
-write me a resume section similar to this (just a bit longer) for a web dev resume based on the points after with made up statistics
-
-## Old
-- **Developed a full-stack web application** to generate rarity rankings for NFT's integrated with leading NFT marketplace's (OpenSea) API, enabling users to **quickly identify rare NFT's** and check their listing status, **improving market research efficiency by 80%**.
-- **Architected a robust Django (Python) [backend](https://github.com/Kevin-Mok/rarity-surf)** to fetch and process NFT metadata from IPFS, store rarity rankings in **PostgreSQL**, and serve the data via GraphQL API, **ensuring low-latency access and scaling to handle 2,000+ concurrent requests**.
-- **Developed a dynamic React (Javascript) [frontend](https://github.com/Kevin-Mok/rarity-surf-frontend)** using hooks to load rarity data in real-time, styled with Tailwind for mobile responsiveness, **improving user experience and reducing frontend load times by 70%**.
-
-## New
-- Developed a full-stack web application to generate rarity rankings for NFT's integrated with leading NFT marketplace's (Magic Eden) API, enabling users to quickly identify rare NFT's and check their listing status, improving market research efficiency by 80%.
-- fetch metadata from either IPFS or website in parallel processes to create rarity rankings as soon as metadata revealed
-- reverse engineered algorithm for rarity rankings for NFT's based on article from marketplace about their in-house statistical rarity ranking
-- created Prisma schema for PostgreSQL for database to store NFT data
-- Node.js backend with API endpoints to return NFT's based on max rank/price along with rarest traits
-- lowest prices for rarity percentile to see if good deal
-- fetch all listings from leading marketplace (Magic Eden) to be able to identify which rare NFT's are on sale and be able to filter based on max price/filter
-- store previous sales data to check whether a buy at rarity percentile is a good deal
-- React FE to dynamically load NFT's based on rarity rank/price filter with ability to hide seen ones
-- Discord bot to notify you when customizable profitable resale opportunity comes up based on rarity level/price
(another content file, name not shown in this view)

@@ -52,41 +52,6 @@ date="Oct 2021" show="true" %}}
 
 <!--- Rarity Surf }}} -->
 
-<!--- Rarity Surf {{{ -->
-
-{{% resume/project name="Rarity Surf (2)"
-languages="Typescript, Node.js, React"
-date="" show="true" %}}
-
-- **Developed a full-stack web application** to generate rarity rankings for NFT's, integrating with **leading marketplace's API** to enable users to quickly identify rare NFT's and check their listing status, **improving market research efficiency by 80%**.
-- **Built a scalable Node.js backend** with REST API endpoints to return NFTs based on customizable filters such as max rank, price, and rarest traits. **Optimized performance** to handle **3,000+ concurrent requests** by implementing efficient data fetching and caching mechanisms, ensuring low-latency access to NFT data.
-- **Developed a dynamic React frontend** to load and display NFT's in real-time based on user-defined filters to streamline browsing. Styled the interface using **Tailwind CSS** for a responsive and modern design, **reducing frontend load times by 50%**.
-- **Developed a Discord bot** to notify users of profitable resale opportunities by leveraging historical sales data to assess deal quality. This feature **increased user engagement by 80%** and provided a seamless way for users to stay updated on market opportunities.
-- Designed and implemented a **PostgreSQL schema** to efficiently store NFT data, including metadata, rarity scores, and historical sales data.
-
-{{% /resume/project %}}
-
-<!--- Rarity Surf }}} -->
 
 <!--- Astronofty {{{ -->
 
 {{% resume/project name="Astronofty"
@@ -177,6 +142,7 @@ url="https://kevin-mok.com/server/" languages="AWS, Kubernetes, Docker, Terrafor
 
 <!--- AWS 3 }}} -->
 
+
 <!--- Astronofty (extended) {{{ -->
 
 {{% resume/project name="Astronofty"
(Hugo shortcode template for the project header, name not shown in this view)

@@ -2,8 +2,9 @@
 <div class="row project-header">
   <div class="col-8 text-left">
     <h2 class="project-title">
-      {{ .Get "name" }} <span class="languages"><{{ .Get "languages" }}></span>
+      {{ .Get "name" }}
     </h2>
+    <span><{{ .Get "languages" }}></span>
   </div>
   <div class="col-4 text-right date">{{ .Get "date" }}</div>
 </div>
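Assuming a call like `name="Rarity Surf" languages="Python, Django, JavaScript, React" date="Oct 2022"` (values taken from the resume content earlier in this compare), the updated template emits markup along these lines; note the languages `<span>` now sits outside the `<h2>` and no longer carries the `languages` class, so the `.languages` rules changed earlier in this compare do not apply to it:

    <div class="row project-header">
      <div class="col-8 text-left">
        <h2 class="project-title">
          Rarity Surf
        </h2>
        <span><Python, Django, JavaScript, React></span>
      </div>
      <div class="col-4 text-right date">Oct 2022</div>
    </div>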
(another Hugo shortcode template, name not shown in this view)

@@ -5,16 +5,13 @@
       {{ .Get "name" }}
     </p>
   </div>
-  <div class="col text-right date">
-    {{ .Get "date" }}
-  </div>
+  <div class="col text-right date">{{ .Get "date" }}</div>
 </div>
+<!-- <span class="title"> -->
 <span class="position">
   {{ .Get "title" }}
 </span>
-{{ with .Get "languages" }}
 <span class="languages">
-  {{ . }}
+  <{{ .Get "languages" }}>
 </span>
-{{ end }}
 </div>
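The new version of this template drops the `{{ with .Get "languages" }} ... {{ end }}` guard, so the languages `<span>` (with its literal angle brackets) is emitted even when a shortcode call omits the `languages` attribute. If that guard were wanted back, a minimal sketch (not part of this commit) could look like:

    {{ with .Get "languages" }}
    <span class="languages">
      <{{ . }}>
    </span>
    {{ end }}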
Submodule static/pdf updated: e4e21878ec...6d0677da34