
AWS Cloud Architect/Big Data Lead Resume


SUMMARY

  • Over 14 years of IT experience in the Analysis, Design, Development, Testing, Implementation, Support and Operations phases of applications
  • 3 years of experience implementing AWS Cloud infrastructure and Big data solutions.
  • Proficient in AWS hosting technologies: VPC, EC2, ELB, RDS, Lambda, SES, SNS, API Gateway, etc.
  • Proficient in AWS solutions architecture, Systems Design, Disaster Recovery and Storage Administration
  • Experience with various Big data technologies - Hadoop, MapReduce, HDFS, Spark, Hive and Sqoop - handling very large data sets in a Data Lake setup.
  • Strong knowledge of AWS Big data solutions - EMR, Kinesis, Redshift, Glue, Athena
  • Experience with monitoring solutions such as AWS CloudWatch, Splunk and SiteScope
  • Strong experience in leading Level-2 and Level-3 application Development/support teams
  • Over 8 years of working experience as ETL/Informatica Lead and Architect
  • Excellent Knowledge in Financial Services domain with strong focus on Reference and Ratings Data
  • Good understanding of Market data from Reuters, Bloomberg, IDC, CapitalIQ and so on
  • Strong experience with various data sets - Market Identifiers, Financial Institutions, Securities, Deals, Maturities, Customers, Prices and Forex rates
  • Expertise in Dimensional Modelling, DW architecture, ETL Frameworks, and Design and Development using Informatica and Oracle
  • Good awareness of Enterprise Data Warehousing (EDW), Analytics, Data Quality, Master Data Management, Data governance
  • Strong hands-on experience in Shell scripting and CRON scheduler
  • Experience in integration of various relational and non-relational sources such as Oracle tables, SQL Server tables, flat files, XML, CSV and Web services.
  • Experience with Advanced Scheduling tools such as Autosys and Control-M
  • Knowledge of Business Intelligence tools: BusinessObjects XI R3, MicroStrategy and OBIEE
  • Over 5 years’ experience in Project Management and Delivery
  • Over 2 years of experience in WebLogic Administration and support
  • Over 2 years of experience in GoldenSource Enterprise Data Management (EDM) suite.
  • Extensive experience in supporting and maintenance of Applications and Platforms with multiple technologies
  • Good understanding of DevOps processes and tools
  • Maintain fault resilient production and disaster recovery environments
  • Collaborate with other internal IT teams, including Operations during Production changes and incidents and actively triage towards resolutions
  • Leading development teams and providing Technical Design & Guidance to the Project
  • Experience working with Offshore teams: providing technical guidance, accepting and reviewing offshore deliverables, and delivering them to the Client
  • Strong experience in Agile/Scrum project management methodology
  • Excellent analytical and problem-solving skills.
  • Excellent Presentation and overall communication skills

TECHNICAL SKILLS

AWS: VPC, EC2, S3, ELB, RDS, Redshift, SQS, SNS, EMR, Kinesis, API Gateway, Lambda, Glue, Athena

Big Data technologies: Hadoop, Spark, Sqoop, Kafka, Hive, Spark SQL, Spark Streaming

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x

Databases: Oracle 12c/11g/10g/9i/8.x, MS SQL Server 2008/2005, Sybase

Web/Application Servers: WebLogic 12.2.1/12.1.3/10.3.5/9.2, Oracle FMW - ODSI

ITSM tools: ServiceNow

EDM Tools: GoldenSource Enterprise Data Management (EDM) suite 8.x/7.x/6.x

Operating Systems: Linux 5/6/7, Sun Solaris 10, Windows XP/7

Scripting: Shell Scripting, SQL, PL/SQL

Database Management Tools: Toad 11/10, PL/SQL Developer, Erwin 7.1/4.1, SQL Loader

Reporting Tools: BusinessObjects XI R3, MicroStrategy and OBIEE

Schedulers: Cron, Autosys, Control-M

Languages: C, C++, Java, XML

Version Control Tools: CVS, VSS, GitHub

DevOps Tools: Ant, Jenkins, Nolio, GitHub, Artifactory

Monitoring Tools: Riverbed APM, HP SiteScope, Splunk

Distributed Technologies: SOAP Web Services, RESTful Web Services, JMS

Tools: Eclipse, SoapUI, VisualVM, Altova XML Spy, MobaXterm, Putty

PROFESSIONAL EXPERIENCE

Confidential

AWS Cloud Architect/Big data Lead

Responsibilities:

  • Involved in creating Spark clusters using AWS EMR
  • Worked on Sqoop to import data from Oracle to S3 (Landing)
  • Involved in loading data from S3-landing bucket to S3-processed bucket using Spark streaming
  • Implemented Data Ingestion in real time processing using Kafka.
  • Integrated Kafka with Spark streaming for high speed data processing.
  • Configured Spark Streaming to receive real time data and store the stream data to S3.
  • Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
  • Developed Spark jobs to move data from Incremental table to Base table at configurable intervals
  • Optimized HiveQL scripts by using Spark as the execution engine.
  • Created new EC2 instances in AWS to replace the old on-premise Solaris LDOMs
  • Exported and imported on-premise VMs to AWS
  • Created new Application Load Balancers to replace on-premise A10 LLBs
  • Managing and creating user accounts and shared folders, providing day-to-day user support, log management and reporting, and applying Group Policy restrictions.
  • Creating S3 buckets in the AWS environment to store files, including buckets serving static content for web applications.
  • Configuring S3 buckets with various lifecycle policies to archive infrequently accessed data to other storage classes based on requirements.
  • Using IAM to create roles, users and groups to provide additional security to the AWS account and its resources.
  • Creating snapshots and images to store launch configurations of the EC2 instances.
  • Involved in automation of AWS infrastructure (compute, storage, network, permissions) using the configuration management tools CloudFormation, Terraform and Ansible.
  • Experience in migrating existing data warehouse from on premise to AWS Redshift using various AWS services
  • Designed tables and columns in Redshift for data distribution across data nodes in the cluster, keeping columnar database design considerations in mind.
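The S3 lifecycle work above can be illustrated with a short sketch. The bucket prefix, tiering thresholds, and rule ID below are hypothetical placeholders, not the actual policies; in practice the returned document would be passed to boto3's put_bucket_lifecycle_configuration.

```python
# Sketch of an S3 lifecycle configuration document like the ones described
# above. Prefixes and day thresholds are illustrative assumptions.
def lifecycle_config(prefix, ia_days=30, glacier_days=90, expire_days=365):
    """Build a lifecycle rule that tiers infrequently accessed data down."""
    return {
        "Rules": [
            {
                "ID": f"archive-{prefix}",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                    {"Days": glacier_days, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": expire_days},
            }
        ]
    }

# Applied (outside this sketch) with something like:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="example-bucket", LifecycleConfiguration=cfg)
cfg = lifecycle_config("landing/")
```

Tiering to STANDARD_IA and then GLACIER before expiry mirrors the "archive the infrequently accessed data" bullet; the exact storage classes chosen would depend on access patterns.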

Confidential

ETL/Informatica/Data Architect

Responsibilities:

  • Design end-to-end ETL processes using Informatica, PL/SQL and shell scripts.
  • Leading ETL/Informatica development team and providing Technical Design & Guidance
  • Experience working with the Offshore team: providing technical guidance, accepting and reviewing offshore deliverables, and delivering them to the Client
  • Troubleshoot and resolve application or infrastructure issues
  • Investigate data issues and service failures and take the necessary follow-up actions to resolution
  • Engage external data providers and internal teams to troubleshoot FTP jobs
  • Provide leadership and Off-Hours support, as needed, to resolve system problems during non-business hours
  • Close monitoring of Production and non-production environments
  • Troubleshooting of ETL failures, database job failures and shell script failures
  • Development of ETL processes on an as-needed basis
  • Work with CMS team to create change requests for Prod releases and infrastructure changes
  • Work with the application Dev team to consolidate release notes, coordinate with QA to update the Change Test plan and other documents, and perform post-production testing
  • Follow CVS process and coordinate with DBA to facilitate DB deployments
  • Perform deployments, provide guidance and support to Offshore Application deployment team
  • Interface with Autosys and Network team to facilitate deployments
  • Communicate application outages to all internal stakeholders
  • Documenting system and process changes
  • Making recommendations to user community and management for improvement
  • Support Linux and Solaris Servers OS patching and Middleware patching
  • Lead the team to perform application, platform and database upgrades
  • Participate in and drive the support team through organization-wide Disaster Recovery exercises
  • Manage incident resolution within the environment, facilitating collaboration among other infrastructure and application support teams as required
  • Collaborate with other IT teams, including Operations during Production changes and incidents and actively triage towards resolutions

Confidential

ETL Architect & Lead/Project Lead

Responsibilities:

  • Understand new ETL requirements or changed requirements to load CreditScope Data and estimate the level of effort to implement enhancements
  • Develop/Modify the ETL design, manage ETL development activities from inception to completion including interdependencies with other teams
  • Design and create the stage database tables as per requirements
  • Analyze requirements and formulate an appropriate technical solution that meets functional and non-functional requirements;
  • Take an active role in the cross-functional design and coordinate with offshore ETL team throughout
  • Build strong relationships with data architecture team, database administration and reporting data consumers and interfaces teams to design warehouse/BI data structures
  • Design and Develop the PL/SQL routines and UNIX shell scripts as per requirements.
  • Create shell scripts to fetch the vendor feeds and load them into the staging tables
  • Coordinate the deployment of the ETL components and dependent objects to Production
  • Closely work and help the Prod support team until all the processes are stabilized.
  • Help in the design of the ETL solution; review the ETL low-level design and design phase deliverables;
  • Manage the build phase and quality-assure the code to ensure that it meets requirements and adheres to standards; resolve difficult design and build issues; troubleshoot bugs
  • Develop mappings to implement complex business logic, transformations and aggregations
  • Plan and execute Quality Assurance testing with the QA team and resolve QA defects
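The vendor-feed-to-staging loads above can be sketched in a few lines. The pipe-delimited layout and field names below are invented for illustration; real feeds would match the vendor's published spec.

```python
# Minimal sketch of parsing a delimited vendor feed into staging-ready
# rows, as in the shell-script feed loads above. The feed layout
# (ISIN|RATING|AS_OF) is a hypothetical example.
import csv
import io

def load_staging(feed_text):
    """Parse a pipe-delimited vendor feed into a list of row dicts."""
    reader = csv.DictReader(io.StringIO(feed_text), delimiter="|")
    return list(reader)

feed = "ISIN|RATING|AS_OF\nUS0000000001|AA+|2023-06-30\n"
rows = load_staging(feed)
```

From here each dict maps naturally onto a staging-table insert, with validation and rejection handling layered on top.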

Confidential

ETL Lead/ Project Lead

Responsibilities:

  • Responsible for designing, modifying, testing & implementing all related ETL functions
  • Understanding the GoldenSource data model for people data.
  • Designed the ETL data flow to ingest LDAP data into staging layer and CMP DB
  • Designed the ETL data flow to ingest LAWSON data into staging layer and CMP DB
  • Designed and developed ETLs to distribute and reconcile data between RPM, CORE and OPDM
  • Created procedures for developing and maintaining metadata and data mapping documents
  • Determined the optimal approach for obtaining data from diverse source system platforms and moving it into the data analytics environment
  • Mentoring and training IT personnel on the Informatica platform
  • Perform problem assessment, resolution and documentation in existing ETL, mapping and workflows as appropriate
  • Involved in performance tuning Informatica mappings, sessions
  • Built ETLs to convert source files to the platinum file format
  • Interacting with the source systems team to escalate data issues.
  • Development of UNIX shell scripts to schedule Informatica ETLs
  • Unit testing and bug fixing of the ETLs
  • Migrating ETL, DB and UNIX scripts to QA and Prod environments.
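The distribute-and-reconcile work between RPM, CORE and OPDM above boils down to keyed comparison of record sets. A hedged sketch, with invented record shapes:

```python
# Sketch of a keyed reconciliation like the RPM/CORE/OPDM work above.
# The key name and record fields are illustrative assumptions.
def reconcile(source, target, key="id"):
    """Return (keys missing from target, keys whose payloads differ)."""
    s = {r[key]: r for r in source}
    t = {r[key]: r for r in target}
    missing = sorted(set(s) - set(t))
    mismatched = sorted(k for k in set(s) & set(t) if s[k] != t[k])
    return missing, mismatched

src = [{"id": 1, "name": "A"}, {"id": 2, "name": "B"}]
tgt = [{"id": 1, "name": "A"}, {"id": 3, "name": "C"}]
missing, mismatched = reconcile(src, tgt)
```

The two result lists are exactly what a reconciliation report surfaces: records that never arrived, and records that arrived with different values.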

Confidential

ETL Lead & Architect/ Project Lead

Responsibilities:

  • Understanding of cross reference, ratings data and bond pricing data
  • Development and testing of generic mapplets to fetch data from the Fitch and Moody's masters
  • Developing mappings using the mapplets and executing them for testing
  • Coordinating with consumers (Clients) to notify about the enhancements to generic mapplets.
  • Development of Unix scripts to FTP the vendor files and to run the Informatica ETLs
  • Developed ETLs to generate reconciliation reports between CORE-CMP
  • Analysis and development to get the bond pricing data from Sybase to Vendor Master (SPREFPROD) and then into Credit Master Platform.
  • Bug fixing and Performance tuning of the ETLs
  • Migrating ETL, DB and UNIX scripts to QA and Prod environments.
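An Informatica mapplet is a reusable transformation shared across mappings; a rough plain-Python analogue of the generic rating mapplets above is a single function applied per vendor feed. The field names and normalization rules here are invented for illustration, not the actual mapplet logic:

```python
# Rough analogue of a reusable "mapplet": one shared transformation
# applied to records from different vendor feeds. The vendor keys and
# the rating-case conventions below are hypothetical.
def normalize_rating(record, vendor):
    """Apply a shared rating-normalization rule for a given vendor."""
    case_rule = {"FITCH": str.upper, "MOODYS": str.capitalize}
    rec = dict(record)  # leave the input record untouched
    rec["rating"] = case_rule[vendor](rec["rating"])
    return rec

fitch = normalize_rating({"id": "X1", "rating": "aa+"}, "FITCH")
moodys = normalize_rating({"id": "X2", "rating": "aa1"}, "MOODYS")
```

Centralizing the rule in one function mirrors the mapplet design goal: when the normalization changes, consumers pick it up without per-mapping edits.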

Confidential

ETL Lead/Project Lead

Responsibilities:

  • Understanding specific data requirements for FGR and MCRS
  • Development and testing of ETLs to maintain history for T&Cs
  • Development and testing of ETLs to delete and re-insert the T&C data
  • Development of Unix scripts to FTP the vendor files and schedule the Informatica jobs
  • QA bug fixing
  • Migrating ETL, DB and UNIX scripts to QA and Prod environments.
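The delete-and-reinsert pattern with history retention above can be sketched as follows; the T&C record layout and key name are invented placeholders:

```python
# Sketch of delete-and-reinsert with history retention, as in the T&C
# ETLs above. Replaced rows move to a history set before the new feed
# is inserted. Field names are illustrative.
def reload_terms(current, history, incoming, key="deal_id"):
    """Archive replaced rows to history, then reinsert the new feed."""
    incoming_keys = {r[key] for r in incoming}
    history = history + [r for r in current if r[key] in incoming_keys]
    current = [r for r in current if r[key] not in incoming_keys] + list(incoming)
    return current, history

cur = [{"deal_id": 1, "term": "old"}]
cur, hist = reload_terms(cur, [], [{"deal_id": 1, "term": "new"}])
```

In the database version the same idea is a move to a history table inside one transaction, so readers never see the keys half-deleted.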

Confidential

ETL Lead/Project Lead

Responsibilities:

  • Understanding of the GoldenSource data model
  • Understanding of GoldenSource EDM tools and Connections and how they process data.
  • Development of Shell scripts to FTP the files from external vendor locations.
  • Involved in the design discussions of the ETL process.
  • Involved in the development of ETL processes to extract the data from various internal sources.
  • Involved in the development of Archiving and Purging process on various internal databases to support the CDC (Change Data Capture) developed by DBA.

Confidential

ETL Lead/Project Lead

Responsibilities:

  • Understanding the requirements based on Use Case documents
  • Development of ETL processes and Database objects from mapping documents.
  • Coordinating with on-site counterparts during development and testing.
  • Unit testing and bug fixing.
  • Implementation and running of ETLs in QA
  • Identify the “root cause” of the defects and Fix them in QA and Regression testing.
  • Reviewing and testing the bugs for accuracy and technical correctness

Confidential

ETL Lead/Project Lead

Responsibilities:

  • Analyzing the Requirements
  • Discuss with project manager on estimations of enhancements
  • Preparation of the Technical System Design Document and Low-Level Design Documents
  • Development of ETL and Database objects
  • Preparation of Test Strategy Document
  • Tracking of enhancements/defects/tickets through QC
  • Find and analyze the “root cause” of the defects of high severity
  • Inform the project manager of SLA violations and critical/high-severity issues
  • Update and provide daily status on activities to the client
  • Preparation of quality related documents and coordinating with quality team in the final inspection.
  • Preparation of implementation plan and doing code walk through with production support team
  • Providing support during the warranty period and fixing bugs that arise in production.

Confidential

Sr ETL Developer

Responsibilities:

  • Development of various ETL processes and testing using Informatica.
  • Involved in the preparation of Low-level design documents.
  • Involved in Informatica mappings, sessions and workflows
  • Fix / resolve issues (coding, data updates, unit testing) based on severity

Confidential

ETL Developer

Responsibilities:

  • Understanding mapping documents.
  • Development of various ETL processes using Informatica and testing of ETL.
  • Extensively involved in Informatica mappings, workflows

Confidential

Sr DB Developer

Responsibilities:

  • Involved in the development of SQL queries, Procedures and Functions
  • Unit testing and creating Test cases

Confidential

DB Developer

Responsibilities:

  • Involved in the development of SQL queries, Procedures
  • Unit testing and creating Test cases
