
Senior Architect/Lead Developer Resume


IL

SUMMARY

  • 12+ years of data warehousing systems architecture and delivery experience.
  • A seasoned enterprise integration architect who has worked through the evolution of the enterprise integration space over the past 11 years, from point-to-point integration to EAI/ETL.
  • Solutions architecture experience with complex enterprise applications, including data warehousing/BI implementations.
  • Expertise in implementing Star schema & snowflake dimensional modeling.
  • Expertise in implementing SCD Type 1, Type 2 & Type 3 (see the SCD Type 2 sketch after this summary).
  • Experienced with Junk dimension implementation.
  • Well experienced in writing Pig Latin scripts and Hive queries.
  • Well versed in importing data into Hadoop using Sqoop & Flume.
  • Solid experience in designing Oozie workflows.
  • Solutions architecture experience with Teradata, Informatica, DataStage, Cognos, and UNIX.
  • Experience in Relational Data Modeling and Dimensional Data Modeling, Star Schema Modeling, Physical and Logical Data Modeling, Erwin 4.0/3.x, Oracle Designer.
  • Industry experience in the banking, auto manufacturing and life sciences sectors.
  • Extensive experience with conduct risk and compliance management application implementations.
  • Hands-on experience writing highly complex Teradata BTEQ scripts (Teradata SQL), FastLoad, MultiLoad and TPump on UNIX & Linux platforms.
  • Tuning of SQL to optimize performance, spool space usage and CPU usage.
  • Hands-on experience developing PL/SQL triggers, cursors, procedures & functions.
  • Hands-on experience using Informatica PowerCenter Designer, Workflow Manager & PowerCenter Monitor across versions 7.x, 8.x & 9.x; well experienced in developing with advanced transformations such as Java, Normalizer, SQL, XML, Web Service and HTTP.
  • Experienced with relational databases such as Teradata, DB2, Oracle and SQL Server, as well as NoSQL stores.
  • Hands-on experience using features such as pushdown optimization, PowerExchange and IDQ; developed Informatica mappings for slowly changing dimensions Type 1, 2 & 3 and advanced workflows.
  • Extensive experience with the Extraction, Transformation, Loading (ETL) process using DataStage.
  • Experience in Hadoop development using HDFS, HBase, Hive, Pig, Sqoop, and Pentaho.
  • Hands-on experience in all phases of the software development life cycle, especially business analysis, technical requirement analysis, database design, per-phase effort estimation, development and testing.
  • Strong interpersonal and communication skills; a strong contributor to knowledge management activities including project documentation, user manuals and other technical documentation.
  • Experience with Agile (Scrum) & Waterfall methodologies.
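A minimal sketch of the SCD Type 2 pattern referenced above, written as a BTEQ step inside a shell script. The logon string, the stg.customer staging table, the dw.dim_customer dimension and its tracked address column are hypothetical placeholders, not details of any specific engagement.

#!/bin/bash
# Illustrative SCD Type 2 maintenance in Teradata BTEQ (all names are placeholders).
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_password;

-- Step 1: expire current dimension rows whose tracked attribute changed.
UPDATE dw.dim_customer
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    customer_id IN (
         SELECT s.customer_id
         FROM   stg.customer s
         JOIN   dw.dim_customer d
           ON   d.customer_id  = s.customer_id
          AND   d.current_flag = 'Y'
         WHERE  s.address <> d.address);

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dw.dim_customer
  (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.customer s
LEFT JOIN dw.dim_customer d
       ON d.customer_id  = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF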

TECHNICAL SKILLS

Hadoop ecosystem: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, ZooKeeper, HBase, YARN

RDBMS: Teradata, Oracle, SQL Server, DB2 and MySQL

Operating Systems: Linux, UNIX & Windows 7

Languages: Java, PL/SQL, SQL, NoSQL, Pig Latin, Shell scripting, JavaScript, Perl

Tools & utilities: Informatica PowerCenter, DataStage, Cognos, Hadoop, Hive, HBase, Pig, Sqoop, MapReduce, Teradata Manager, SQL Assistant, BTEQ, FastLoad, MultiLoad & TPT, TMM, Ferret, TSET, NCR PUT and various administrative tools such as cnstool

PROFESSIONAL EXPERIENCE

Confidential

Senior Architect/Lead Developer

Responsibilities:

  • Involved in various phases of the Software Development Life Cycle (SDLC), including requirement gathering, modeling, analysis, architecture and development; the project was developed using Agile methodologies.
  • Gathered requirements through project discussions and prepared design documents based on those requirements.
  • Designed the ETL strategy and defined and documented the integrated ETL design.
  • Defined the source-to-target mapping document based on the BRD.
  • Developed HiveQL scripts to export the data from source systems.
  • Once the data is prepared using Hive, the files are moved from Hadoop to the local file system.
  • Informatica jobs then consume the extracted data.
  • Once the ETL process completes, XML and CSV files are generated.
  • Business users approve and push the files to the regulatory-designated location.
  • Developed Hadoop ecosystem Sqoop scripts to load data fixes into the source system databases' landing zone (the extract/export flow is sketched below).
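A minimal sketch of the Hive extract, the hand-off to the local file system for Informatica, and the Sqoop load-back of data fixes described above. The database, table, directory and JDBC connection names are hypothetical placeholders.

#!/bin/bash
# Export the Hive-prepared data to HDFS as CSV (Hive 0.11+ syntax; names are placeholders).
hive -e "
INSERT OVERWRITE DIRECTORY '/staging/extracts/claims'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM src_db.claims_prepared;
"

# Pull the extract from HDFS to the local file system for the Informatica jobs to consume.
hdfs dfs -getmerge /staging/extracts/claims /data/informatica/srcfiles/claims.csv

# Push approved data fixes back to the source system's landing zone with Sqoop.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/SRCDB \
  --username etl_user --password-file /user/etl/.pwd \
  --table LANDING_ZONE_CLAIMS_FIX \
  --export-dir /staging/fixes/claims \
  --input-fields-terminated-by ','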

Confidential, IL

Lead Developer

Responsibilities:

  • Worked with business teams to gather end-to-end requirements.
  • Performed feasibility analysis.
  • Performed data profiling and assessed data quality.
  • Walked project stakeholders through the initial model, ETL strategy and feasibility analysis.
  • Defined a star schema model with various facts and dimensions (see the DDL sketch below).
  • Collected additional feedback and reviews from the stakeholders.
  • Designed the actual model using the Erwin data modeler.
  • After gaining confidence from the various stakeholders, implemented the final model.
  • Handed over the DB scripts that create the physical data model to the application DBA.

Environment: Erwin & SQL Developer
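A minimal sketch of the kind of physical-model DDL handed over to the application DBA, run here through a SQL*Plus heredoc. The connection variables and the dim_customer/fact_sales tables and columns are hypothetical placeholders.

#!/bin/bash
# Create a small star schema: one dimension and one fact (all names are placeholders).
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
CREATE TABLE dim_customer (
  customer_key   NUMBER        PRIMARY KEY,
  customer_id    VARCHAR2(20)  NOT NULL,
  customer_name  VARCHAR2(100)
);

CREATE TABLE fact_sales (
  sale_date_key  NUMBER        NOT NULL,
  customer_key   NUMBER        NOT NULL REFERENCES dim_customer (customer_key),
  sale_amount    NUMBER(12,2),
  quantity       NUMBER
);
EXIT
SQL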

Confidential

ETL Developer/Data Modeler

Responsibilities:

  • Worked with business teams to gather end-to-end requirements.
  • Performed a feasibility study.
  • Performed data profiling and assessed data quality.
  • Walked project stakeholders through the initial model, ETL strategy and feasibility analysis.
  • Defined a star schema model with various facts and dimensions.
  • Collected additional feedback and reviews from the stakeholders.
  • Designed the actual model using the Erwin data modeler.
  • After gaining confidence from the various stakeholders, implemented the final model.
  • Handed over the DB scripts that create the physical data model to the application DBA.

Environment: Erwin & SQL Developer

Confidential

ETL Design/Developer

Responsibilities:

  • Designed and developed the interface modules between Oracle and Java.
  • Loaded data from mainframe datasets to the Teradata staging area using DataStage.
  • Performed transformations from the staging to the dimensional area using Teradata BTEQ.
  • Loaded data from different databases into HDFS using Sqoop and then into partitioned Hive tables.
  • Performed geospatial trend analysis using Teradata BTEQ (production) and, as part of a POC, using MapReduce.
  • Developed Pig scripts to transform the data into a structured format and automated them through Oozie coordinators.
  • Developed Hive queries and Pig scripts as per the design.
  • Performed scenario-based unit testing of Pig scripts using the PigUnit (PigTest) framework and of MapReduce jobs using MRUnit.
  • Developed Oozie workflows for daily incremental data loads from Oracle into Hive tables (see the Sqoop/Oozie sketch below).
  • Developed MapReduce jobs using the Java API for complex requirements involving custom partitioners and comparators to segregate data with embedded key values.
  • Managed and reviewed Hadoop log files.
  • Exported the analyzed data to relational databases using Sqoop for visualization and report generation by the BI team.
  • Collaborated with application teams to install operating system and Hadoop updates, patches and version upgrades as required.
  • Designed the ETL flow for several newly onboarding Hadoop applications.
  • Reviewed ETL application use cases before onboarding them to Hadoop.

Environment: Linux (RHEL), Hadoop, HDFS, MapReduce API, Hive, Pig, Oozie, Sqoop, Oracle, XML, shell scripting, Eclipse, MRUnit, PigTest
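A minimal sketch of the daily incremental Sqoop-to-Hive load automated through Oozie, as described above. The Oracle connection, the SALES_TXN source table, the dw.sales_txn Hive table (assumed to be a delimited text table partitioned by load_dt and matching the Sqoop output format), the watermark state file and the coordinator.properties file are all hypothetical placeholders.

#!/bin/bash
# Incremental import of new/changed rows from Oracle into HDFS (names are placeholders).
sqoop import \
  --connect jdbc:oracle:thin:@//orahost:1521/ORCL \
  --username etl_user --password-file /user/etl/.pwd \
  --table SALES_TXN \
  --incremental lastmodified \
  --check-column UPDATED_TS \
  --last-value "$(cat /var/run/etl/sales_txn.lastvalue)" \
  --target-dir /staging/sales_txn/$(date +%Y%m%d)

# Load the new extract into the matching Hive partition.
hive -e "
LOAD DATA INPATH '/staging/sales_txn/$(date +%Y%m%d)'
INTO TABLE dw.sales_txn PARTITION (load_dt='$(date +%Y-%m-%d)');
"

# The daily schedule itself was driven by an Oozie coordinator, submitted along the lines of:
# oozie job -oozie http://oozie-host:11000/oozie -config coordinator.properties -run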

Confidential

Teradata ETL Design/Development

Responsibilities:

  • Coordinated between the onsite client team and the offshore team.
  • Loaded data from different databases into HDFS using Sqoop and then into partitioned Hive tables.
  • Performed geospatial trend analysis using Teradata BTEQ (production) and, as part of a POC, using MapReduce.
  • Developed Pig scripts to transform the data into a structured format and automated them through Oozie coordinators.
  • Developed Hive queries and Pig scripts as per the design.
  • Performed scenario-based unit testing of Pig scripts using the PigUnit (PigTest) framework and of MapReduce jobs using MRUnit.
  • Developed Oozie workflows for daily incremental data loads from Oracle into Hive tables.
  • Developed MapReduce jobs using the Java API for complex requirements involving custom partitioners and comparators to segregate data with embedded key values.
  • Performed functional & technical requirement analysis.
  • Prepared the integrated requirements document.
  • Prepared the integrated design document.
  • Evaluated construction delivered from offshore, covering:
  • Database design (indexing, dimensions, facts)
  • Load strategy
  • Worked with the development team to improve the BTEQ scripts (see the BTEQ sketch below).
  • Prepared UNIX shell scripts.
  • Designed test procedures for the final reports.

Environment: Linux shell scripting, Teradata BTEQ, FastLoad, MultiLoad, Informatica PowerCenter, Cognos
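A minimal sketch of the kind of stage-to-dimension BTEQ step tuned with the development team. The logon string and the stg.product/dw.dim_product tables are hypothetical placeholders.

#!/bin/bash
# Insert only new business keys from the stage table into the dimension (names are placeholders).
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_password;

INSERT INTO dw.dim_product (product_id, product_name, load_dt)
SELECT s.product_id, s.product_name, CURRENT_DATE
FROM   stg.product s
LEFT JOIN dw.dim_product d
       ON d.product_id = s.product_id
WHERE  d.product_id IS NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF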

Confidential

ETL Design/Development

Responsibilities:

  • Performed functional & technical requirement analysis.
  • Prepared the integrated requirements document.
  • Prepared the integrated design document.
  • Evaluated construction delivered from offshore.
  • Resolved design and technical issues.
  • Worked with the ETL team to improve the performance of existing Informatica jobs.
  • Prepared UNIX shell scripts (see the pmcmd wrapper sketch below).
  • Constructed mappings.
  • Designed test procedures.

Environment: UNIX, Informatica, Oracle, SQL, PL/SQL, XML, shell scripting
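A minimal sketch of the kind of UNIX wrapper script prepared around the Informatica jobs above, using the pmcmd command-line client. The domain, integration service, folder, workflow name and credential variables are hypothetical placeholders.

#!/bin/bash
# Start an Informatica workflow and wait for it to finish (all names are placeholders).
pmcmd startworkflow \
  -sv INT_SVC_DEV -d DOMAIN_DEV \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f FIN_DW -wait wf_load_gl_balances

rc=$?
if [ $rc -ne 0 ]; then
  echo "Workflow wf_load_gl_balances failed with return code $rc" >&2
  exit $rc
fi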

Confidential - Washington

ETL Developer

Responsibilities:

  • Developed BTEQ, FastLoad, MultiLoad and TPump scripts.
  • Archived and restored data across Teradata versions.
  • Exported data from Teradata using FastExport (see the sketch at the end of this section).
  • Installed Teradata on different platforms and upgraded the databases.
  • Brought the Teradata system up and down using the TPA reset utility.
  • Monitored system performance status using PMON.
  • Configured VPROCs using the Vproc Manager utility.
  • Improved resource partition capabilities using the Priority Scheduler.
  • Monitored, controlled and administered the Teradata system using Teradata Manager.
  • Performed analysis and comparison, captured query plans and fine-tuned definitions using Visual Explain.
  • Configured clusters for fallback protection and scalability.
  • Defined views and macros and configured data dictionary views using the DBW supervisor screen, DIP and the BTEQ utility.
  • Configured the Teradata system using the sysinit, config and reconfig utilities and DIP scripts.
  • Granted and revoked perm, spool and temp space.
  • Created, granted and revoked access rights, roles and profiles.
  • Tuned and established system values and debugged diagnosed problems using the DBS Control utility.
  • Analyzed the workload, provided recommendations and performed what-if analysis using the Index Wizard and Statistics Wizard.
  • Used PMON for real-time performance monitoring and capacity planning.
  • Used TSET to simulate a production-like environment in test by matching and exporting workloads.
  • Managed disk space utilization using Ferret.
  • Archived and restored databases and tables using the ARC utility.
  • Worked with Teradata Administrator on the creation, modification, dropping and cloning of users and databases.
  • Worked with the Recovery Manager to view and interact with transaction recovery and AMP recovery.
  • Upgraded and downgraded the DB version on different platforms such as Windows, SUSE Linux and UNIX MP-RAS using the NCR PUT utility.
  • Configured the mainframe connection according to the production requirements of the Teradata database.
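A minimal sketch of the FastExport usage mentioned above: a pipe-delimited extract of a fact table. The log table, logon string, output path and dw.fact_sales columns are hypothetical placeholders.

#!/bin/bash
# Export a pipe-delimited extract from Teradata with FastExport (names are placeholders).
fexp <<'EOF'
.LOGTABLE etl_wrk.sales_fexp_log;
.LOGON tdpid/etl_user,etl_password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/exports/sales.dat MODE RECORD FORMAT TEXT;

SELECT CAST(sale_id AS VARCHAR(20)) || '|' ||
       CAST(sale_amount AS VARCHAR(20))
FROM   dw.fact_sales;

.END EXPORT;
.LOGOFF;
EOF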
