Viknesh Sk

Viknesh Sk Email and Phone Number

GCP Data Architect | GCP Cloud | AWS Cloud | Composer - Apache Airflow | Dataproc - PySpark @ TEKsystems Global Services in India
Viknesh Sk's Location
Bengaluru, Karnataka, India
About Viknesh Sk

At Searce Inc, we successfully deployed 200 DAGs to orchestrate R script execution and migrated over 700 database tables to BigQuery, demonstrating my expertise in data architecture and Google Cloud Composer. Our team harnessed cloud computing to drive innovation and efficiency in the aviation industry. Previously, as an Associate Technical Architect at Quantiphi, our collaborative work in machine learning and data engineering pipelines produced significant advances, such as the PQR-AI model and ICD-10 code extraction. We consistently translated complex data challenges into scalable, cloud-based solutions, earning a reputation for excellence in technology integration and strategic problem-solving.

Viknesh Sk's Current Company Details
TEKsystems Global Services in India

GCP Data Architect | GCP Cloud | AWS Cloud | Composer - Apache Airflow | Dataproc - PySpark
Viknesh Sk Work Experience Details
  • TEKsystems Global Services in India
    Architect
    Oct 2024 - Present
    Bengaluru, Karnataka, India
  • Searce Inc
    Data Architect
    May 2022 - Sep 2024
    Bengaluru, Karnataka, India
    Project 1: IndiGo (Industry: Aviation)
    Use case 1: R script triggering with Cloud Composer
    1. Deployed 200 DAGs to copy R scripts from GCS to Compute Engine and trigger them on a Windows server.
    2. Implemented a DAG-generator DAG that binds a Jinja template to a YAML config file to create multiple DAGs automatically (see sketch 1 after the experience list).
    3. Configured DAGs to send emails on success and failure.
    Use cases 2 & 3: Database migration and archival (Compute Engine MSSQL to BigQuery)
    1. Migrated 700+ tables from MSSQL to BigQuery as a one-time load.
    2. Deployed Composer Airflow scripts to archive 700+ tables every 15 days.
    3. Replicated the schema of every table from MSSQL to BigQuery.
    4. Scheduled a Composer DAG to initiate a Dataflow job that fetches MSSQL tables and archives them into BigQuery.
    5. Similarly migrated 40+ tables from Cloud SQL (MySQL) to BigQuery as a one-time load.
    Project 2: Birlasoft, Haleon (Industry: Consumer Healthcare)
    Use case 1: SAP-SFMC to BigQuery
    1. Deployed a Composer DAG that runs a daily batch pipeline, pulling data with the SFTPToGCSOperator and writing it into the BigQuery landing zone for multiple tables.
    2. Tasked BigQuery stored procedures in DAGs to move data across BigQuery layers, from the landing zone to the cleansed and curated zones.
    3. Deployed the DAG to include tables dynamically; it also handles load failures and automatically reloads the failed date's data by referring to a date in a metadata table.
    Use case 2: Abandoned carts, near-real-time data transfer
    1. An API publishes messages for carts idle more than 15 minutes to Pub/Sub.
    2. The Pub/Sub topic triggers a Cloud Function that formats the message and writes it into BigQuery; another function on the same trigger sends a message to SFMC for email (see sketch 2 after the experience list).
    3. Data from BigQuery is sent to SFMC and DV360 through Cloud Functions and Composer.
    BigQuery PII masking (personally identifiable information): automated scripts apply PII policy tags for taxonomies (see sketch 3 after the experience list).
  • Quantiphi
    Associate Technical Architect - Machine Learning
    Mar 2021 - May 2022
    Bangalore Urban, Karnataka, India
    Job Profile: Create GCP architecture diagrams - Create machine learning solution procedures - Create ML data workflows - Create data engineering pipelines - Create inference pipelines - Test and demo for clients - Prepare TDD documents. (Led machine learning, platform engineering, data engineering, and software development teams.)
    Project 1: Oscar Health (ICD-10 code extraction and classification from medical claims)
    1. Extracted around 320K PDF files using the Cloud Vision API and OCR processors.
    2. Created data pipelines with Cloud Composer and Apache Airflow.
    3. Used the CAML algorithm to train and develop the PQR-AI (payer quality and risk) model, built on Cloud AI across around 50 experiments with different datasets and batch sizes, tuning hyperparameters, step sizes, and sequence lengths with TensorFlow.
    Project 2: Walden University (Text-to-speech podcast creation)
    1. Web-scraped the university site periodically using Selenium web drivers running on GCP App Engine.
    2. Summarized the text and encoded it into SSML (Speech Synthesis Markup Language) files.
    3. Fed the SSML to the Azure speech engine to generate podcasts, controlling voice, pronunciation, tone, and speaking rate (see sketch 4 after the experience list).
    4. Surfaced the generated podcasts in a podcast app for Walden students in production.
    Project 3: Hartford Insurance Group (Email and document classification and entity extraction)
    1. Extracted text from emails and their attached documents using an OCR processor.
    2. Classified them with an XGBoost model; classified emails were then passed to entity extraction.
    3. Used ACORD, an insurance-domain-specific pre-release processor, to extract entities from attached documents.
    4. Also scripted an alternate "region of interest" logic over the documents to pick out entities.
  • Bristlecone
    Data Scientist
    Jun 2019 - Mar 2021
    RMZ Centennial, ITPL Main Rd, Whitefield, Bengaluru
    Job Profile: Create ML models - Write Python scripts - Create AWS data flows - Data science - ETL - Web scraping.
    Project 1: Corning (Anomaly detection and classification)
    1. Pipelined data from a client database for preprocessing and training.
    2. Classified each transaction among 10 different anomalies across cost and compliance categories.
    3. Trained multiple models (LightGBM, XGBoost, and random forest) and validated them using confusion matrices, CAP curves, and regularization methods.
    4. The entire process runs on an EC2 machine that loads a pickled pre-trained model; results were surfaced with their classifications in an application.
    Project 2: Corning (Trending news suggestion)
    1. Web-scraped news articles from a list of websites.
    2. Used TF-IDF vectorization and a naive Bayes model to pick relevant words across websites for suggestions (see sketch 5 after the experience list).
  • Nabard Financial Services
    Senior Data Engineer
    Aug 2018 - May 2019
    Siddanna Layout, Banashankari Stage II, Bengaluru
    Job Profile: Analyze banking data - Write Python scripts - Create ML models - Create end-to-end data engineering pipelines - Design SQL tables.
    Requirement: Classify performing and non-performing assets (loan payers).
    1. Performed descriptive and prescriptive analytics on features such as family, net worth, repayment delay, and region.
    2. Pipelined data from a database for preprocessing and training.
    3. Classified accounts as performing or non-performing based on previous loan repayments using the XGBoost algorithm (see sketch 6 after the experience list).
    4. Launched the model to production after testing on one-year-old customer data.
  • Mahindra Logistics
    Senior Data Analyst
    Mar 2017 - Aug 2018
    Infantry Road, Bangalore
    Job Profile: Analyze logistics data - Write Python scripts - Create ML models - Create end-to-end data engineering pipelines - Design SQL tables.
    Requirement: Cab allocation demand prediction.
    1. Applied predictive analysis to cab allocation and routing data from the corporate car booking domain using pandas.
    2. Engineered multiple features based on office-to-home distance, region, traffic conditions at the time, and car size and availability.
    3. Used random forest regression to find the optimal number of cars to allocate (see sketch 7 after the experience list).
  • JLL
    Senior Data Analyst
    May 2013 - Feb 2017
    Bangalore
    Job Profile: Data collation - Descriptive, prescriptive, and predictive data analysis - Implement data web apps - Create regression ML models - Write Python scripts.
    Requirement: Data analysis and ERP implementation for IFM and real estate data.
    1. Analyzed different types of data using pandas, regex, and charting methods.
    2. Implemented multiple web applications and modules for the facilities management and real estate domains; trained clients and site teams on usage and tracking.
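
Sketch 1 (referenced in the Searce Inc entry): a minimal, hypothetical illustration of the DAG-generator pattern, in which a YAML config is bound to a Jinja2 template to emit one Airflow DAG file per entry. The file names, config fields, and the gsutil copy step are assumptions, not the original implementation.

```python
# Generate one Airflow DAG file per entry in a YAML config by rendering
# a Jinja2 template. All names below are illustrative assumptions.
import pathlib

import yaml
from jinja2 import Template

DAG_TEMPLATE = Template("""\
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="{{ dag_id }}",
    schedule_interval="{{ schedule }}",
    start_date=datetime(2023, 1, 1),
    catchup=False,
) as dag:
    # Copy the R script from GCS; the real pipeline would then trigger it
    # on the Windows server. Only the copy step is sketched here.
    copy_script = BashOperator(
        task_id="copy_r_script",
        bash_command="gsutil cp {{ gcs_path }} /tmp/{{ dag_id }}.R",
    )
""")

def generate_dags(config_path: str, output_dir: str) -> None:
    """Render one DAG file per entry in the YAML config."""
    config = yaml.safe_load(pathlib.Path(config_path).read_text())
    out = pathlib.Path(output_dir)
    for spec in config["dags"]:  # each spec: dag_id, schedule, gcs_path
        (out / f"{spec['dag_id']}.py").write_text(DAG_TEMPLATE.render(**spec))

if __name__ == "__main__":
    # Hypothetical paths: a config file and the Composer DAGs folder.
    generate_dags("dag_config.yaml", "/home/airflow/gcs/dags")
```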
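
Sketch 2 (referenced in the Searce Inc entry): a minimal sketch of the abandoned-cart flow, assuming a first-generation Pub/Sub-triggered Cloud Function that decodes the idle-cart message and streams it into BigQuery. The table name and message fields are hypothetical.

```python
# Pub/Sub-triggered Cloud Function (1st gen signature): decode the
# base64 message payload, reshape it, and stream it into BigQuery.
import base64
import json

from google.cloud import bigquery

BQ_TABLE = "my-project.carts.abandoned"  # hypothetical destination table
client = bigquery.Client()

def on_idle_cart(event, context):
    """Entry point: 'event' carries the Pub/Sub message."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    row = {
        "cart_id": payload["cart_id"],          # assumed message fields
        "user_email": payload["email"],
        "idle_minutes": payload.get("idle_minutes", 15),
    }
    errors = client.insert_rows_json(BQ_TABLE, [row])  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```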
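
Sketch 3 (referenced in the Searce Inc entry): one way such automated PII-tagging scripts could look, assuming policy tags from a Data Catalog taxonomy are attached to known PII columns by rewriting the table schema. The table, column, and taxonomy names are placeholders.

```python
# Attach a policy tag to PII columns by rebuilding the table schema and
# persisting only the schema change. All identifiers are placeholders.
from google.cloud import bigquery

PII_COLUMNS = {"email", "phone_number"}  # hypothetical PII column names
POLICY_TAG = "projects/my-project/locations/us/taxonomies/123/policyTags/456"

def apply_pii_tags(table_id: str) -> None:
    client = bigquery.Client()
    table = client.get_table(table_id)
    new_schema = []
    for field in table.schema:
        if field.name in PII_COLUMNS:
            # Recreate the field with the policy tag attached.
            field = bigquery.SchemaField(
                field.name,
                field.field_type,
                mode=field.mode,
                policy_tags=bigquery.PolicyTagList(names=[POLICY_TAG]),
            )
        new_schema.append(field)
    table.schema = new_schema
    client.update_table(table, ["schema"])

apply_pii_tags("my-project.sales.customers")  # hypothetical table
```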
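
Sketch 4 (referenced in the Quantiphi entry): a minimal sketch of the SSML-to-podcast step using the Azure Speech SDK, assuming a neural voice and prosody controls for rate and pitch. The voice name, subscription key, and region are placeholders.

```python
# Synthesize an SSML document to a WAV file with the Azure Speech SDK.
import azure.cognitiveservices.speech as speechsdk

SSML = """\
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xml:lang="en-US">
  <voice name="en-US-JennyNeural">
    <prosody rate="0.95" pitch="+2%">
      Welcome to this week's Walden research digest.
    </prosody>
  </voice>
</speak>"""

config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="eastus")
audio = speechsdk.audio.AudioOutputConfig(filename="episode.wav")
synthesizer = speechsdk.SpeechSynthesizer(speech_config=config, audio_config=audio)
result = synthesizer.speak_ssml_async(SSML).get()  # block until synthesis finishes
print(result.reason)
```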
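
Sketch 5 (referenced in the Bristlecone entry): a toy sketch of the trending-word idea, assuming TF-IDF vectorization plus a multinomial naive Bayes model whose per-class log-probabilities surface the most characteristic words. The texts and labels are illustrative only.

```python
# TF-IDF + multinomial naive Bayes: surface the top words per category.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

texts = [
    "glass substrate demand rises in display market",
    "fiber optics shipments grow on data center demand",
    "quarterly earnings beat estimates on strong sales",
]
labels = ["tech", "tech", "finance"]  # hypothetical site categories

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(texts)
nb = MultinomialNB().fit(X, labels)

# Top words per class by learned log-probability.
terms = np.array(vec.get_feature_names_out())
for cls, logp in zip(nb.classes_, nb.feature_log_prob_):
    print(cls, terms[np.argsort(logp)[-3:]][::-1])
```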
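
Sketch 6 (referenced in the Nabard Financial Services entry): a minimal sketch of a performing/non-performing asset classifier with XGBoost on synthetic tabular data, validated with a confusion matrix. The feature names and the stand-in label rule are assumptions.

```python
# XGBoost binary classifier on synthetic loan-like features.
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
# Hypothetical features: net_worth, repayment_delay_days, family_size
X = rng.normal(size=(1000, 3))
y = (X[:, 1] > 0.5).astype(int)  # 1 = non-performing asset (stand-in rule)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X_train, y_train)
print(confusion_matrix(y_test, model.predict(X_test)))
```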
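
Sketch 7 (referenced in the Mahindra Logistics entry): a toy sketch of demand prediction with random forest regression on synthetic features. The feature meanings and the stand-in demand formula are assumptions.

```python
# Random forest regression on synthetic cab-allocation features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical features: office-home distance, traffic index, hour of day
X = rng.uniform(0, 1, size=(800, 3))
y = 5 * X[:, 0] + 3 * X[:, 1] + rng.normal(scale=0.1, size=800)  # stand-in demand

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = RandomForestRegressor(n_estimators=300, random_state=1)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```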

Viknesh Sk Education Details

Karpagam Academy Of Higher Education, Erode Hindu Kalvi Nilayam, Vetha Loga Vidhyalaya, Sri Vijay Vidhyalaya

Frequently Asked Questions about Viknesh Sk

What company does Viknesh Sk work for?

Viknesh Sk works for TEKsystems Global Services in India.

What is Viknesh Sk's role at the current company?

Viknesh Sk's current role is GCP Data Architect | GCP Cloud | AWS Cloud | Composer - Apache Airflow | Dataproc - PySpark.

What schools did Viknesh Sk attend?

Viknesh Sk attended Karpagam Academy Of Higher Education, Erode Hindu Kalvi Nilayam, Vetha Loga Vidhyalaya, Sri Vijay Vidhyalaya.
