
Data/SQL Developer Resume


PROFESSIONAL SUMMARY

  • Seasoned IT professional with 6 years of experience in Business Analysis, Financial Data Analysis, Data Modeling, Business Function specifications, developing business process modeling flow diagrams, Project Management, Risk Management, and performing GAP analysis.
  • Excellent knowledge of industry-standard methodologies such as the System Development Life Cycle (SDLC) and iterative development life cycle processes such as the Rational Unified Process (RUP), XP, and Agile methodologies.
  • Handled a team providing global support for Couchbase database technology; identified problems early and worked them to resolution with customers and internal teams.
  • Familiar with how reporting software works (Tableau/MSTR/SSRS, etc.)
  • Analyzed the business and wrote Business Rules Document.
  • Implemented and followed a Scrum Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
  • More than 3 years of experience in Informatica PowerCenter 8.x/9.x, Informatica PowerExchange 8.x, and Teradata 12.
  • Expert in providing ETL solutions for any type of business model.
  • Used Python to modify SQL and Bash scripts, used the os package to create and run dynamic shell scripts in Python, and used Python to convert data from nested JSON structures to Pandas DataFrames.
  • Worked with the AWS Cloud platform and its features, including EC2, VPC, RDS, EBS, S3, CloudWatch, CloudTrail, CloudFormation, and Auto Scaling.
  • Provide regular CX KPI reports and present to the senior management, functional departments as well as external partners.
  • Created Hive tables as per requirements, internal or external tables defined wif appropriate static and dynamic partitions, intended for efficiency.
  • Worked extensively with Hive DDLs and Hive Query Language (HQL).
  • Extensively used T - SQL in constructing User Functions, Views, Indexes, User Profiles, Relational Database Models, Data Dictionaries, and Data Integrity.
  • Extensive working knowledge and experience in building and automating processes using Airflow.
  • Excellent experience creating data models for relational schemas (OLTP) and dimensional schemas (OLAP) with proper normalization and denormalization techniques, and developed database schemas such as star schema and snowflake schema.
  • Experienced in various diagramming techniques, including wireframes, process maps, flowcharts, functional demonstrations, context diagrams, and BPM modeling techniques.
  • Strong Teradata skills that include build/maintain Teradata tables, Views, Constraints, Indexes, SQL & PL/SQL scripts, Functions, Triggers and Stored procedures.
  • Implemented slowly changing dimensions of Type 1, Type 2 and Type 3 and Performance tuning of sources, mappings, targets and sessions in Informatica and SQL queries.
  • Used Agile methodology for repeated testing.
  • Involved in Manual Testing by checking of various validations.
  • Created, maintained, and executed manual test scripts.
  • Developed Functional Specifications and Testing Requirements using Test Director to conform to user needs.
  • Knowledge of scripting for automation (e.g., Python, Perl, Ruby)
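The nested-JSON-to-DataFrame conversion mentioned above can be sketched with `pandas.json_normalize`; the sample records and field names below are purely illustrative, not from any actual project dataset.

```python
import pandas as pd

# Hypothetical nested JSON records, e.g. parsed from an API response.
records = [
    {"id": 1, "user": {"name": "Ana", "region": "EU"}, "metrics": {"clicks": 10}},
    {"id": 2, "user": {"name": "Raj", "region": "US"}, "metrics": {"clicks": 7}},
]

# json_normalize flattens nested dicts into dotted column names.
df = pd.json_normalize(records)
print(sorted(df.columns.tolist()))
# ['id', 'metrics.clicks', 'user.name', 'user.region']
```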

TECHNICAL SKILLS

Programming Languages: Python 3 (Advanced), JIRA, Agile, SAS/MACRO (Certified), MATLAB (Intermediate), PHP (Intermediate), HTML5 (Intermediate), CSS3 (Intermediate), JavaScript (Intermediate), XML (Intermediate)

Frameworks: Flask (Intermediate), Angular 8 (Intermediate)

Data Visualization Tools: Power BI (Certified), Tableau (Advanced), AWS, MS Excel (Advanced), Qlik View (Advanced)

Virtualization Tools: Docker (Intermediate), Virtual Box (Intermediate)

Databases: Oracle (Advanced), SQL (Advanced), MySQL (Advanced), Airflow, RedShift (Intermediate), Microsoft SQL Server (Intermediate), Couchbase, MongoDB (Intermediate)

Big Data Tools: Databricks (Intermediate), Hadoop (Intermediate), Hive (Intermediate)

Operating Systems: Linux - Debian (Intermediate), Windows Server 2016 (Beginner)

PROFESSIONAL EXPERIENCE

Data/SQL Developer

Confidential

Responsibilities:

  • 5+ years of experience with relational database systems, including Oracle, SQL, and PL/SQL.
  • Created a daily aggregated report for the client to support investment decisions and help analyze market trends.
  • Installed, monitored, and upgraded NoSQL (Couchbase and MongoDB) databases across all environments (DEV, SIT, UAT, PRODFIX, PROD).
  • Created SQL queries for diverse usage and ensured data integrity with frequent backups and restores using PostgreSQL.
  • Assisted with the design and development of upfront test cases.
  • Involved in extraction, transformation, and loading of data directly from different source systems (flat files/Excel/Oracle/MSSQL/Teradata) using SAS/SQL and SAS/macros.
  • Hands-on experience scripting in T-SQL to construct tables, indexes, user profiles, relational database models, data dictionaries, and data integrity checks; optimized indexes and statistics for the reporting data model.
  • Couchbase cluster tuning and configuration; operationalization and runbook creation; reviewed and provided subject-matter expertise during project kickoff.
  • Used the AWS command line client and management console to interact with AWS resources and APIs.
  • Leverage ETL tools like Informatica/Wherescape/Oracle Warehouse Builder to ingress and egress information in large quantities using services, real time and batch processes.
  • Generated PL/SQL scripts for data manipulation, validation and materialized views for remote instances.
  • Developed multiple MapReduce jobs in Python for data cleaning and preprocessing.
  • Create SSIS packages for migrating data.
  • Strong experience wif reporting tools like Tableau or other reporting packages.
  • Informatica ETL development to integrate ThinkFolio wif upstream source systems and downstream target systems.
  • Delivering high-quality and innovative Ab Initio graphs/Informatica workflows.
  • Created automated pipelines in AWS CodePipeline to deploy Docker containers in AWS ECS using services such as CloudFormation, CodeBuild, CodeDeploy, S3, and Puppet.
  • Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources (CSV, Excel, and SQL).
  • Knowledge of a programming or scripting language (R, Python, Ruby, or JavaScript)
  • Configured the above jobs in Airflow.
  • Helped develop validation framework using Airflow for the data processing.
  • Responsible for developing requirements and creating requirements documents and process flows.
  • Good knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, Secondary NameNode, and MapReduce concepts.
  • Understanding the client business problems and analyzing the data by using appropriate Statistical models to generate insights.
  • Created reports on both relational databases and analytical cubes (SSAS).
  • Experience with the Hadoop stack (Hive, Pig), Airflow, Sqoop, and MapReduce.
  • Initiated alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc., and take recommended actions for better performance.
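The MapReduce data-cleaning jobs above ran on Hadoop; a minimal pure-Python sketch of the same map/shuffle/reduce pattern follows, with made-up sample records standing in for real input.

```python
from itertools import groupby
from operator import itemgetter

# Illustrative raw records; in the real jobs the input came from HDFS.
raw = ["  Alice,NY ", "bob,ny", "ALICE,NY", "carol,ca"]

# Map: normalize each record and emit (key, 1) pairs.
mapped = []
for line in raw:
    _name, state = (field.strip().lower() for field in line.split(","))
    mapped.append((state, 1))

# Shuffle: sort by key so equal keys become adjacent.
mapped.sort(key=itemgetter(0))

# Reduce: sum the counts per key.
counts = {state: sum(c for _, c in group)
          for state, group in groupby(mapped, key=itemgetter(0))}
print(counts)  # {'ca': 1, 'ny': 3}
```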

Environment: MS Office 2007, MS Visio 2003, SQL, AWS, Couchbase, Python, JIRA, Hadoop, Agile, CVS, SSIS, Docker, ETL, AWS ECS, SAS, PL/SQL, PostgreSQL, Airflow, Node, Hive, Tableau, SharePoint, PowerPoint, MS Project, Windows XP, UML, Test Director, MS Outlook

Data Analyst

Confidential

Responsibilities:

  • Conducted independent statistical analysis, descriptive analysis and Logistic Regression.
  • Create and maintain shell scripts to backup and restore Oracle and PostgreSQL databases.
  • Expertise in writing dynamic-SQL, complex stored procedures, functions and views.
  • Performed ad hoc analysis of data sources for all external and internal customers.
  • Imported the claims data into Python using Pandas libraries and performed various data analysis.
  • Created interactive dashboards and visualizations of claims reports, competitor analysis and improved statistical data using Tableau.
  • Assisted with incident resolution by working with the Couchbase engineering support team; assisted app teams during stress testing of applications with Couchbase.
  • Assisted with designing Couchbase high availability and failover capabilities to meet application requirements.
  • Supported technical team members for technologies such as Microsoft Excel.
  • Responsible for configuring, integrating, and maintaining all Development, QA, Staging, and Production PostgreSQL databases within the organization.
  • Formulated procedures for data extraction, transformation and integration of health care data.
  • Used Excel VLOOKUPs to determine customer data and created pivots to easily access and validate data.
  • Evaluating NoSQL products like Couchbase.
  • Involved in creating complex stored procedures according to business logic and creating RDLs for reporting; experience designing DAGs using Airflow/Luigi/AWS Data Pipeline (Airflow preferred).
  • Worked wif nested Stored Procedures and Functions.
  • Involved in the performance tuning of the existing SQL code.
  • Created objects like tables and views and developed SSIS packages to load data.
  • Implemented Incremental Loading into the target tables/views in SSIS.
  • Created SSIS packages for moving data between databases.
  • Involved in maintaining the Error-Logging for the SSIS packages.
  • Extensive knowledge in Client/Server Technology, GUI Design, Relational Database Management Systems RDBMS, and Rapid Application Development Methodology.
  • Moved data across tables with one-to-many and many-to-one relationships.
  • Assisted the team for standardization of reports using SAS macros and SQL.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
  • Converted data into actionable insights by predicting and modelling future outcomes.
  • Analyze data using data tools such as Python, Tableau, SQL and/or Excel in order to identify program risks.
  • Imported preprocessed data from RDBMS into SAS and performed statistical analysis using SAS/Macros and SAS/SQL to effectively reduce coding time and identify patterns of risk.
  • Created reports and dashboards, by using Tableau, to explain and communicate data insights, significant features, models scores and performance of new recommendation system to both technical and business teams.
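The Excel VLOOKUP validation above has a direct pandas analogue: a left merge on the lookup key. The sample frames below are invented for illustration only.

```python
import pandas as pd

# Hypothetical claims table and customer lookup table.
claims = pd.DataFrame({"cust_id": [101, 102, 103], "amount": [250, 400, 125]})
customers = pd.DataFrame({"cust_id": [101, 102], "name": ["Ana", "Raj"]})

# A left merge keeps every claim; unmatched keys get NaN, like VLOOKUP's #N/A.
looked_up = claims.merge(customers, on="cust_id", how="left")
print(looked_up["name"].tolist())  # ['Ana', 'Raj', nan]
```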

Environment: Microsoft Outlook, MS Access, MS Excel, DDA, Couchbase, SAS, SQL, SSIS, AWS, Airflow, Python, Tableau, MS Word, MS Project, MS Visio, Hive, TOAD, ETL, Hadoop, JAD, VLOOKUP, QA, ETL Informatica, Teradata, Test Track, Peer, Mainframe, OLAP, ERWIN, ER Studio, ClearQuest, ClearCase, Java, DB2 Database

Research Assistant/Data Analyst

Confidential

Responsibilities:

  • The aim of the study was to identify patterns of neural and physiological activity during food selection and associations with the nutritional content of individuals’ final food selections in VR and real-life buffets.
  • Collected quantitative data using various body sensors to capture neurological and physiological signals during food selection; performed data annotation and data analysis using data cleaning, Excel, histograms, and boxplots.
  • Experience wif one or more ETL tools like Ab Initio, Informatica.
  • Worked with machine learning algorithms such as regressions (linear, logistic, etc.), SVMs, and decision trees to classify groups and analyze the most significant variables, such as FTE, purchase-order waiting times, and available capacities, and applied process improvement techniques.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Knowledge of relational and non-relational databases such as MySQL, MongoDB, LevelDB, and ElasticSearch.
  • Relational Database Management Systems (RDBMS) especially SQL Server
  • Generated the DDL of the target data model and attached it to the JIRA ticket for deployment in different environments.
  • Worked on AWS Elastic Beanstalk to deploy, monitor, and scale an application.
  • Created new EC2 instances in AWS, allocated volumes, and granted permissions using IAM.
  • Developed T-SQL scripts to create databases, database objects and to perform DML and DDL tasks.
  • Develop the test plan, test conditions and test cases to be used in testing based on business requirements, technical specifications and/or product knowledge.
  • Performed extensive data validation and data verification against the data warehouse and debugged SQL statements and stored procedures for business scenarios.
  • Used Spark DataFrames, Spark SQL, and Spark MLlib extensively, developing and designing POCs using Scala, Spark SQL, and MLlib libraries.
  • Developed SQL scripts for loading data from the staging area to target tables and worked on SQL and SAS script mapping.
  • Performed transformations of data using Spark and Hive according to business requirements for generating various analytical datasets.
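The DDL/DML scripting above was done in T-SQL; as a compact, runnable stand-in, here is the same create/insert/update flow using Python's built-in sqlite3 (the table and column names are hypothetical, and SQLite's dialect differs from T-SQL in details).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: create a staging table.
cur.execute("""
    CREATE TABLE staging_orders (
        order_id INTEGER PRIMARY KEY,
        status   TEXT NOT NULL,
        amount   REAL
    )
""")

# DML: insert rows, then update a subset.
cur.executemany(
    "INSERT INTO staging_orders (order_id, status, amount) VALUES (?, ?, ?)",
    [(1, "new", 99.5), (2, "new", 10.0)],
)
cur.execute("UPDATE staging_orders SET status = 'loaded' WHERE amount > 50")
conn.commit()

rows = cur.execute(
    "SELECT order_id, status FROM staging_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 'loaded'), (2, 'new')]
```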

Environment: Informatica PowerCenter 9.5/9.1, Informatica Cloud, Oracle 10g/11g, SQL Server 2005, Tableau 9.1, Salesforce, RightNow, Eloqua, webMethods, PowerShell, Unix, Hadoop, MS Office 2007, MS Visio 2003, SQL, AWS, JIRA, Agile, CVS, SSIS, Docker, ETL, AWS ECS, SAS, PL/SQL, PostgreSQL, Airflow, Node

Data/Business Analyst

Confidential

Responsibilities:

  • Developed Python scripts to scrape the data from Quarterly and Annual reports to analyze and predict the future stock price
  • Includes data collection, data cleaning, validation and Visualizations of large data sets using Apache Spark, SQL Management Server and Excel. Generated Reports and dashboards using Power BI
  • Built S3 buckets and managed policies for S3 buckets, and used S3 buckets and Glacier for storage and backup on AWS.
  • Used Agile methodology and SCRUM process for project developing.
  • Held knowledge transfer (KT) sessions with the client to understand their various data management systems and the underlying data.
  • Develop and maintain SQL scripts, indexes and complex queries for analysis and extraction.
  • Perform Data Profiling, Data pipelining, and Data Mining, validating and analyzing data (Exploratory analysis / Statistical analysis) and generate reports
  • ETL process to clean and load large data sets extracted from several websites (JSON/CSV files) into the MySQL server.
  • Created reports and dashboards by connecting multiple data sources to help track user trends, behavior, engagement, and performance.
  • Running SQL scripts, creating indexes, stored procedures for data analysis
  • Worked on data that was a combination of unstructured and structured data from multiple sources and automated the cleaning using Python scripts.
  • Extensively performed large data read/writes to and from CSV and excel files using pandas.
  • Tasked wif maintaining RDD's using SparkSQL.
  • Communicated and coordinated with other departments to collect business requirements.
  • Tackled a highly imbalanced fraud dataset using undersampling with ensemble methods, oversampling, and cost-sensitive algorithms.
  • Improved fraud prediction performance by using random forest and gradient boosting for feature selection with Python scikit-learn.
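The imbalanced-dataset handling above can be illustrated with a minimal random-undersampling sketch in plain Python; the labels are synthetic, and the real work also used ensemble and cost-sensitive variants.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Synthetic imbalanced labels: 95 legitimate (0) vs 5 fraud (1).
data = [(i, 0) for i in range(95)] + [(i, 1) for i in range(95, 100)]

minority = [row for row in data if row[1] == 1]
majority = [row for row in data if row[1] == 0]

# Undersample the majority class down to the minority size.
balanced = minority + random.sample(majority, len(minority))
print(len(balanced))  # 10
```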

Environment: Rational Enterprise Suite (Rose, RDD's, ClearCase, ClearQuest), SparkSQL, CVS, PL/SQL, RUP, LoadRunner, Visual Basic, ETL Informatica, SQL, SQL scripts, SQL Server, SQL scripts, Toad, AGILE, JIRA, Oracle, Mainframe, WebLogic, ERStudio, HTML, WinRunner, ERWIN, Project Management.

Support Data/Business Analyst

Confidential

Responsibilities:

  • Worked with the data analyst to gather requirements from stakeholders’ user stories and facilitated story-pointing to discuss project specifications.
  • Gathered business requirements through interviews and surveys with users and business analysts.
  • Worked on Software Development Life Cycle (SDLC) wif good working knowledge of testing, agile methodology, disciplines, JIRA, tasks, resources, and scheduling.
  • Performed logical data modeling, physical Data modeling (including reverse engineering) using the Oracle SQL Developer tool.
  • Involved in converting Hive/SQL queries into transformations using Python.
  • Developed stored procedures in SQL Management Server to standardize DML transactions such as insert, update and delete from the database.
  • Developed SQL Queries to fetch complex data from different tables in databases using joins, database links.
  • Performed data analysis of existing databases to understand the data flow and business rules applied to different databases using SQL.
  • Consistently ensure a reliable support to Couchbase customers worldwide.
  • Created SSIS package to load data from Flat files, Excel and Access to SQL server using connection manager.
  • Developed all the required stored procedures, user defined functions and triggers using T-SQL and SQL.
  • Used MS Visio and Rational Rose to represent system under development in a graphical form by defining use case diagrams, activity, and workflow diagrams.
  • Wrote complex SQL, PL/SQL, procedures, functions, and packages to validate data and test processes.
  • Performed Data Analysis and Data validation by writing SQL queries using SQL assistant.
  • Additional Languages: Python, Cloud Formation, JavaScript, HTML 5, R
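Converting a Hive/SQL aggregation into a Python transformation, as in the bullet above, often amounts to mapping a GROUP BY onto a pandas groupby; the table and columns below are made up for illustration.

```python
import pandas as pd

# Hypothetical rows; the equivalent Hive query would be:
#   SELECT region, SUM(amount) AS amount FROM sales GROUP BY region
sales = pd.DataFrame({
    "region": ["east", "west", "east"],
    "amount": [100, 50, 25],
})

totals = sales.groupby("region", as_index=False)["amount"].sum()
print(totals)
```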

Environment: Windows XP, Rational Rose, JIRA, MS Visio, SQL, PL/SQL, Procedures, agile methodology, Rational Requisite Pro, UML, MS Project, Oracle 9i, MS Office, UML, MS Word, SQL, SQL Queries, SSIS, T-SQL, SQL server, Clear Quest, Test Director
