
KISHOR MOHITE

Mobile : +91-9168227288
Mail : kishor.mohite@gmail.com
LinkedIn : linkedin.com/in/kishor-mohite-a901b667

Summary
●  Over 18 years of software design, development, delivery and management experience.
●  Actively participated in and led business/system consulting discussions, software architecture,
project estimations, design, development, integration, testing and deployment.
●  Business domain expertise includes life insurance, manufacturing, supply chain management and
order management. 3+ years of onsite work experience in the United States (Chicago/Moline).
●  Well versed in Waterfall and Agile software development methodologies.
●  Flexible, innovative and open-minded, with effective communication skills and a constant focus on
building strong client relationships.

Certifications

●  AWS Certified Cloud Practitioner.
●  AWS Certified Solutions Architect.
●  PCEP – Certified Entry-Level Python Programmer.
●  Scrum Inc. – Registered Product Owner.
●  Scrum Inc. – Registered Scrum Master.
●  SAS Certified Base Programmer for SAS 9.
●  IBM Certified Database Associate – DB2 9 Fundamentals.

Technical Skills

Operating Systems       MVS/ESA, Unix, Windows
Languages               Python, C/C++, Java, JavaScript, Servlets, JSP, COBOL, JCL, SQL, SAS,
                        ENFORM, MapReduce, ABAP
Online/UI               IMS-DC, CICS, Screen COBOL, Web Dynpro, HTML, JavaScript, JSP
Databases               PostgreSQL (Amazon Aurora), DB2, VSAM, IMS-DB, Enscribe, HBase, SAP HANA, SAP MDG
Utilities/Tools         GitHub, Jenkins, Databricks, RMJ, File-AID, XPEDITER, ChangeMan, Panvalet,
                        SPUFI, QMF, ESP, Pig, Hive, PuTTY
Middleware              MQ, SAP-PI
ETL                     Informatica, Pentaho, BODS
AWS/Big Data            S3, Route53, API Gateway, VPN/Subnets, IAM, CloudFormation, EC2,
                        CloudWatch, Lambda, Redshift, EMR, Athena
Software Methodology    Agile, Waterfall

Employment History

Organization                            Designation                 Duration

John Deere India Pvt Ltd                Engineering Manager         Jan-2017 to date
John Deere India Pvt Ltd                Technical Lead              Oct-2011 to Jan-2017
John Deere India Pvt Ltd                Associate System Analyst    Jan-2007 to Oct-2011
EDS (Electronic Data Systems Pvt Ltd)   Information Associate       Apr-2006 to Jan-2007
CGI India Pvt Ltd                       Software Engineer           Jun-2003 to Apr-2006

Education

Qualification        University/Institution                   Commencement    Completion    Percentage
B.E. Computer        Amrutvahini College of Engineering,      04/1998         05/2002       66%
                     University of Pune, INDIA
12th (HSC-Science)   Shree Nagesh Vidyalaya, Jamkhed,         04/1995         03/1998       82%
                     Maharashtra, INDIA
10th (SSC)           Shree Nagesh Vidyalaya, Jamkhed,         03/1994         03/1995       84.57%
                     Maharashtra, INDIA

Project Experience Details

Company John Deere India Pvt. Ltd.


Project Product, Parts & EPDM
Duration Jan 2017 to date
Role Engineering Manager (Jan 2020 to date)
Team Size 35
Tools & Environment AWS (S3, Route53, API Gateway, VPN/Subnets, IAM, CloudFormation, EC2,
CloudWatch, Lambda), SAP HANA, BODS, Mainframe Tech Stack (COBOL
85, IMS DB/DC, VSAM, DB2, SAS, JCL, Panvalet, ESP, File-AID), Informatica,
JAVA

Description:
Product & Parts is an application responsible for maintaining master data for all Deere machines
(complete goods) as well as the material information that makes up each whole good. All machine and part
operations within Deere need to interact with Product & Parts master data to conduct day-to-day
transactions. The master data is tightly connected with Deere factory and warehouse operations across the
globe and is among the critical systems that keep Deere's operations running smoothly.
1) Product holds the machine serial number and the associated component and configuration
information of a machine. It also places the machine into a specific product hierarchy, which is
essential for classifying machines within Deere. All on-field and off-field operations of the
machine are heavily dependent on the identity, configuration and categorization data that the
Product application holds.
2) Parts is an application that maintains all the material information for parts manufactured
within Deere or by suppliers. Along with basic material information such as part dimensions,
name and manufacturing unit, the application is also responsible for maintaining the
shipping and packaging rules for the parts. All parts-ordering applications as well as
warehouse operations within Deere interact with the Parts master data to get the
related part information and attributes.

Responsibilities:
● Responsible for onsite-offshore project delivery.
● Project/resource management, architecture and design activities.
● LSM implementation to improve satisfaction at both the product and customer level.
● Led the architecture and design of large-scale, complex projects aligned with the business, information
and technical vision and with enterprise standards, with the goal of ensuring consistent, fit-for-purpose
design.

● The design work included systems analysis and design, system modeling, performance modeling,
integration planning, and technology and component selection, to ensure architectural consistency
and coherence between technical and business processes.
● Ensured maintainability of the system by appropriately decomposing it into modules,
services and components.
● Ensured that the design fulfils the project's non-functional requirements such as scalability, performance,
transactionality, concurrency and security.
● Managed and led capacity planning, infrastructure design, performance tuning and data modeling.
● Led the design of technical solutions required to enable data integration and interoperability
across applications.
● Troubleshot complex technical challenges related to project implementation.
● Responsible for all project deliveries w.r.t. design completion, design standards conformance and
code quality.
● Served as the pivotal point for technical deliverables.
● Identified and implemented value additions in customer service offerings.

Company John Deere India Pvt. Ltd.


Project Information Knowledge Center (IKC)
Role Technical Lead
Team Size 16
Duration Oct 2011 – Jan 2017
Tools & Environment JAVA, JSP, COBOL 85, IMS DB/DC, VSAM, DB2, SAS, JCL, Panvalet, ESP, File-
AID, Informatica, SAP HANA, BODS.

Description:
The IKC (Information Knowledge Center) domain area within Deere is responsible for maximizing company
decision-making by providing integrated data, knowledge and analytics services. The primary services
offered by IKC are:
1) ETL / Data Warehouse / Data Mart Services: This refers to all the activity required to keep
the IKC databases up and running. It includes the management of the day-to-day
processes that keep the data warehouse / data mart operational, the database
operations to support and maintain the database system, and the ETL processes that keep the
data warehouse / data mart populated with up-to-date information. In addition, this service
includes maintaining documentation, monitoring metrics and data quality issues, and
maintaining the metadata associated with the data warehouse.
2) Data Quality: Data quality refers to how well data supports the business activities of an
organization. It includes data integrity (how well data is maintained), data accuracy (how
well data represents the real world) and data completeness (ensuring that all data needed to
support business activities is available).
3) Master Data Information: Managing the life cycle, usage, quality standards, security, access
rules and documentation of master data for subject areas such as Customers, Channels, Facilities,
Human Resources, Materials, Organizations, Products, References, Suppliers and Financials.

Responsibilities:
● Build designs and provide estimates for prioritized product features, and work with the Scrum Master
to remove blockers.
● Build new ETL processes using Informatica and test them for performance optimization.
● Application design, coding and maintenance. Design new interfaces to bring data from multiple
sources into the data warehouse. Data modeling for new tables or changes to existing tables.
● Diagnose and resolve technical problems. Fix support and development technical issues and
document them.

● Peer-review the work completed by team members. Pair-program with new
developers to ramp them up in an unfamiliar environment.
● Document testing issues and defects and review them with the development team to fix the root
cause. Get input from the Product Owner to clarify functional requirements.
● Provide support for pre-implementation and implementation activities. Schedule and monitor
deployment activities. Be available 24x7 after deployment to resolve any support issues.
● Update the project documents with all necessary technical details of new application
development to make the application easier to support.

Company John Deere India Pvt. Ltd.


Project PDC
Role Associate System Analyst
Team Size 12
Duration Jan 2007 – Oct 2011
Tools & Environment JAVA, COBOL 85, SCREEN COBOL, TANDEM COBOL, IMS DB/DC, VSAM, DB2,
JCL, MQ Series, Enscribe, ChangeMan, Panvalet.

Description:
PDC (Parts Distribution Center) is a distributed application system which manages the order fulfillment
process of Deere & Co worldwide. The application is divided into the following subsystems:
1) Material Management: Manages the inventory and forecasting of parts/complete goods
2) Order Management: The order placement process, whether by customer, dealer or warehouse
3) Distribution Management: The entire process of moving parts from the warehouse to the
customer/dealer location
4) Financial Management: Billing and invoicing of orders

Responsibilities:
● Clarify requirement doubts with the project/tech lead before starting new application
development.
● Build new screens and design new databases to meet data storage requirements.
● Complete design/development as per requirements and do a peer review with the Tech Lead.
● Perform routine maintenance on application screens, code and databases. Add/change fields
on screens. Prepare data models and create/update database tables.
● Diagnose and resolve technical problems during development as well as while working
on application support issues.
● Participate in the peer review process for new development and log review comments.
● Document issues and defects, then review them with the Tech Lead or Business Analysts. Do a root
cause analysis while working on assigned defects.
● Demo newly developed application functionality to the Tech Lead and get acceptance for
production deployment.
● Add comments and explanations in the program to explain newly written code. Tag changes
with the user ID to identify what change was made. Do a code compare prior to production
deployment.

Company EDS India Pvt. Ltd.


Project xpedx
Client xpedx
Role Information Associate
Team Size 8
Duration Apr 2006 – Jan 2007
Tools & Environment COBOL, DB2, IMS DB/DC, QMF, File-AID, SPUFI, MQ Series, ChangeMan, SQL, JCL.

Description:
xpedx, an International Paper company, is a customer-driven distributor of printing paper, packaging
supplies and equipment. The xpedx market focus is retail, commercial printing, on-demand printing, heavy
manufacturing, the automotive industry, hi-tech manufacturing, publishing and national accounts.

ACCESS is a Mainframe based distribution application system owned by xpedx / International Paper.
ACCESS manages the supply chain for the distribution business and also provides many customized
programs for large National Account customers.

Responsibilities:
● Application development, enhancement and maintenance as per technical specifications. Split the
assigned work into user stories and work with the Tech Lead/Product Owner to add them to sprint/PSI
planning.
● Diagnose and resolve technical problems. Document technical issues. Do the root cause
analysis and add it to the checklist to make sure the same issue does not occur again.
● Work with the database team to implement data model changes and complete data migration. Test
application functionality after database changes and share the results with the Tech Lead.
● Update the progress of assigned user stories in Rally and report any development blockers during daily
stand-up with the team.
● Perform unit, regression and integration testing and get it approved prior to any new deployment
to production.
● Analyze defects raised by the testing team and track their status until closure.
● Do the impact analysis for the migration of code from IMS to the DB2 database.
● Build new screens in IMS DB/DC. Integrate IMS transactions with JAVA web services and
perform end-to-end integration testing.

Company CGI Information and Management Systems India Pvt. Ltd


Project MANULIFE
Client John Hancock
Role Software Engineer
Team Size 10
Duration Jun 2003 – Apr 2006
Tools & Environment COBOL, IMS DB/DC, CICS, VSAM, DB2, JCL, ChangeMan, ESP (Scheduler).

Description:
The MANULIFE Compensation and Commission System consists of four applications: Commission,
Compensation, Replacement Insurance Detection System (RIDS) and Persistency. The Commission system
determines commissions, service fees, etc. The GA Compensation module deals with all kinds of
compensation related to General Agents. The Replacement detection system was developed to detect
replacements of insurance policies, annuities, etc., and to generate transactions that adjust an
agent's commission when necessary. Persistency, also called the 5-year persistency index system,
captures and retains the policy and agent data pertinent to the calculation of specific agent/agency
persistency indexes.

Responsibilities:
● Prepare low-level technical designs from given specifications and get them reviewed by the Tech Lead.
● Develop new application code as per the technical specification and timeline.
● Verify program code against the standard checklist followed by the project/team.
● Prepare a test plan for new development listing all positive and negative test cases, and execute the
test plan.

● Responsible for populating the deployment tracker with the list of activities for the assigned project
task and sharing the status with the Tech Lead/client after deployment.
● Update Rally with the daily progress made on assigned tasks and give demos to get
acceptance of new development.
● Provide application support to resolve end-user questions about functionality.
● Build, test and evaluate new job schedules.
● Do the impact analysis for the VSAM to DB2 migration.
● Load and unload data in DB2 and optimize SQL.

Personal Details

●  Date of Birth : 07/06/1980
●  Marital Status : Married
●  Nationality : Indian
●  Sex : Male
●  Languages : English, Hindi, Marathi
●  Passport No : L7683366
