T24 Data Migration Tool – Service Mode
This guide explains the functionality of a non-core T24 Data Migration Tool used to upload data into T24 from any legacy system.
Introduction
The T24 Data Migration tool was originally written using the EB.PHANTOM option. Phantom processing in T24 is gradually moving from EB.PHANTOM to TSA.SERVICE, and the Data Migration tool has likewise been modified to run as a Service. The original functionalities of
o Multi-threaded transaction takeover
o Takeover performance reporting
have been preserved in the latest version.

Product Overview
The T24 Data Migration tool is used during the pre-live phase of any site implementation. It covers the following functionalities:
o Uses the standard OFS module to perform the upload. The option is given to use either OFS or the standard jBASE WRITE function.
o Validates the data prior to update.
o Accepts data in any layout and format, including double-byte characters (provided the jBASE release used for the implementation is 4.1 or higher).
o Requires little manual intervention when mapping the incoming file to T24 applications; the mapping is a one-time setup.
o Supports upload of site-specific local tables.
o Performs all standard T24 validations, and can also handle special cases by performing local validations on the data being uploaded.
o Supports scaling using the multi-threading capabilities of TSA.SERVICE. Throughput, however, depends largely on the hardware capabilities.
o Provides Stop/Resume options in case Server/T24/database connectivity is lost in the middle of an upload; the operation resumes from where it was paused. Standard TSA.SERVICE transaction management capabilities are available.
o Reports any erroneous data present in the data file through exception handling. The report includes a detailed description of each error raised during the upload.
Tables
The T24 Data Migration tool package consists of the following tables.

DM.MAPPING.DEFINITION
This application allows the user to define the format of the data in the incoming tape. The fields and their descriptions are listed below (No. Field Name – Description).

@ID – The ID of this application is free-format text, up to 35 characters long. A meaningful ID indicating the application name (e.g. CUSTOMER) and any subdivision (e.g. IND for individuals, CORP for corporates) should be provided, e.g. CUSTOMER-IND, CUSTOMER-CORP.
1. GB.DESCRIPTION – Free-text description of the upload.
2. APPLICATION.NAME – Name of the T24 table to which the data is being uploaded.
3. LOAD.TYPE – Either 'OFS' or 'WRITE', specifying whether the upload is via the OFS module or the jBASE WRITE function.
4. OFS.ACTION – A value of 'VALIDATE' will only check whether the contents of the data fit into the T24 table. A value is allowed in this field only when LOAD.TYPE is set to OFS.
5. OFS.FUNCTION – The FUNCTION to be used by OFS to process the transaction: I, R, A or D. 'I' is taken as the default.
6. FILE.TYPE – The type of T24 file to be updated. Values can be $NAU, LIVE or $HIS. A value is allowed only when LOAD.TYPE is set to WRITE.
7. OFS.VERSION – The T24 application's VERSION to be used when uploading through OFS. A value is allowed in this field only when LOAD.TYPE is set to OFS.
8. IN.DATA.DEF – Defines the way in which the fields are read: DELIM, POSITION or OFS.
9. FM.DELIM – The delimiter used to identify each field in the data. A value is allowed only when IN.DATA.DEF is set to DELIM.
10. VM.DELIM – The delimiter used to identify each multi-value field in the data. A value is allowed only when IN.DATA.DEF is set to DELIM.
11. SM.DELIM – The delimiter used to identify each sub-value field in the data. A value is allowed only when IN.DATA.DEF is set to DELIM.
12. ESC.SEQ.FR – When a particular character in the incoming data needs to be converted, the character to be converted must be provided here.
13. ESC.SEQ.TO – Works in conjunction with ESC.SEQ.FR; this is the character to which the ESC.SEQ.FR character will be converted.
14. ID.TYPE – Either 'AUTO' (the system generates the ID), 'DATA' (the ID is hard-coded in the incoming file itself) or 'ROUTINE' (an attached routine forms the ID).
15. ID.ROUTINE – The routine that forms the ID for each record uploaded. A value is allowed only when ID.TYPE is set to ROUTINE.
16. ID.POSITION – The position at which the ID of each record can be found. A value is allowed only when ID.TYPE is set to DATA.
17. ID.LENGTH – The length of the ID in each record of the data file. A value is allowed only when ID.TYPE is set to DATA. Note: this field is used together with ID.POSITION to extract the ID from the raw file.
18.1 APPL.FIELD.NAME – The field name in the T24 table to which the data is updated (associated multi-value set till 24.1).
19.1 FIELD.POSITION – The starting position of the data that will be updated in the T24 field (associated multi-value set till 24.1). A value is allowed only when IN.DATA.DEF is set to POSITION.
20.1 FIELD.LENGTH – The length of the data that will be updated in the T24 field (associated multi-value set till 24.1). A value is allowed only when IN.DATA.DEF is set to POSITION. Note: this field is used together with FIELD.POSITION to extract the field data from the raw file.
21.1 FIELD.ATTRIB – Either 'ROUTINE' (the data for this field will be formed by a routine) or 'CONSTANT' (a static value will be updated for the field).
22.1 FIELD.VALUE – The static value that will be updated for the field. A value is allowed only when FIELD.ATTRIB is set to CONSTANT.
23. POST.UPDATE.RTN – The post-update routine, used as a user exit routine that is triggered for every record being uploaded.
24. OFS.SOURCE – The OFS source record to be used when uploading data into T24. Two OFS source records are provided by default: 1. DM.OFS.SRC; 2. DM.OFS.SRC.VAL, with FIELD.VAL set to 'YES' for cases where field validation processing must be triggered, e.g. AC.ACCOUNT.LINK.
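To make the DELIM and POSITION options concrete, the following is a minimal illustrative sketch in Python (not the tool's actual jBASE code). The dictionary keys mirror the field names above; the parsing logic itself is an assumption about how such parameters would typically be applied.

```python
# Illustrative sketch only: how DM.MAPPING.DEFINITION parameters could drive
# the parsing of one raw record from the incoming flat file.

def apply_escapes(raw, esc_from, esc_to):
    # ESC.SEQ.FR / ESC.SEQ.TO: convert each listed character to its replacement.
    for src, dst in zip(esc_from, esc_to):
        raw = raw.replace(src, dst)
    return raw

def parse_record(raw, mapping):
    # Split one record into field values according to IN.DATA.DEF.
    raw = apply_escapes(raw, mapping.get("ESC.SEQ.FR", []), mapping.get("ESC.SEQ.TO", []))
    if mapping["IN.DATA.DEF"] == "DELIM":
        # FM.DELIM separates fields; VM.DELIM and SM.DELIM separate the
        # multi-values and sub-values within each field.
        return [[value.split(mapping["SM.DELIM"])
                 for value in field.split(mapping["VM.DELIM"])]
                for field in raw.split(mapping["FM.DELIM"])]
    if mapping["IN.DATA.DEF"] == "POSITION":
        # FIELD.POSITION is a 1-based start offset, FIELD.LENGTH the width.
        return [raw[pos - 1:pos - 1 + length]
                for pos, length in zip(mapping["FIELD.POSITION"],
                                       mapping["FIELD.LENGTH"])]
    raise ValueError("unsupported IN.DATA.DEF: " + mapping["IN.DATA.DEF"])

# Delimited example: ',' between fields, '~' between multi-values.
delim_map = {"IN.DATA.DEF": "DELIM", "FM.DELIM": ",", "VM.DELIM": "~", "SM.DELIM": "^"}
fields = parse_record("JOHN SMITH,LONDON~PARIS", delim_map)

# Positional example: field 1 at position 1 (length 2), field 2 at position 3 (length 4).
pos_map = {"IN.DATA.DEF": "POSITION", "FIELD.POSITION": [1, 3], "FIELD.LENGTH": [2, 4]}
cols = parse_record("AB1234", pos_map)
```

As the sketch suggests, positional extraction yields one flat value per field, which is why multi-value and sub-value fields are only supported in DELIM mode.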
DM.SERVICE.CONTROL
This application is used:
o To define the company for which the data is uploaded
o To define the location of the incoming data
o To control the execution of the data upload process
The fields and their descriptions are listed below (No. Field Name – Description).

@ID – Must be the same as the DM.MAPPING.DEFINITION record ID.
1. UPLOAD.COMPANY – The company for which the data is uploaded. It is also possible to provide the upload company in the incoming data; however, once a value is provided here, it takes precedence over the incoming data.
2. FLAT.FILE.DIR – The directory where the flat file resides. This must be a valid directory name, e.g. DATA.BP. Either the actual path or a relative path of the directory may be provided.
3. FLAT.FILE.NAME – The file name of the incoming data. Must be an existing file name.
4. NO.OF.SESSIONS – The number of threads that the TSA.SERVICE must spawn for processing this request.
5. RUN.STATUS – The status of the process. Either START (updated automatically by the system when the DM.SERVICE.CONTROL record is verified) or STOP (a manual request to stop the takeover process).
6. CONTROL.LOG – Contains the processing messages of the service. Updated automatically.
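The constraints above (valid directory, existing file, at least one session, a START/STOP status) can be pictured as a pre-flight check. The following Python sketch is an illustration only; the dictionary keys echo the field names, but the check itself is not part of the tool.

```python
# Illustrative sketch only: pre-flight validation of a DM.SERVICE.CONTROL-style
# record before the upload service is started.
import os

def validate_control(record):
    errors = []
    # FLAT.FILE.DIR must be a valid directory, FLAT.FILE.NAME an existing file.
    if not os.path.isdir(record["FLAT.FILE.DIR"]):
        errors.append("FLAT.FILE.DIR must be a valid directory")
    elif not os.path.isfile(os.path.join(record["FLAT.FILE.DIR"],
                                         record["FLAT.FILE.NAME"])):
        errors.append("FLAT.FILE.NAME must be an existing file")
    # NO.OF.SESSIONS drives the number of TSA.SERVICE threads.
    if record["NO.OF.SESSIONS"] < 1:
        errors.append("NO.OF.SESSIONS must be at least 1")
    # RUN.STATUS is either START or STOP.
    if record["RUN.STATUS"] not in ("START", "STOP"):
        errors.append("RUN.STATUS must be START or STOP")
    return errors
```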
How Does It Work
This section covers the functionality of the DMIG tool. Each of the tables and their related functionality is explained, and sample setups are given. Use the menu MB.DMD for operations using the DM tool. Clicking the menu launches a composite screen with three menus, namely Data Loading, Upload Details and Record Details, to facilitate efficient working with the DM tool. A sample screenshot of the composite screen is shown below.
Menus

Data Loading Menu
The Data Loading menu contains the applications for uploading the data.

Mapping Definition
Table – DM.MAPPING.DEFINITION
Purpose
This is the base mapping definition table, which maps the incoming data to T24 fields. The following items are defined here:
o T24 table to be uploaded
o Type of load to be done (e.g. OFS or flat jBASE write)
o Type of action to be performed (e.g. only validate the data, or perform the actual update)
o Function to be used (allowed functions are Input, Authorise, Delete, Reverse); if not specified, the Input function is used by default
o Way to identify each field in a transaction/record (e.g. by delimiter or position)
o Associated multi-value sets to map the T24 fields with the data, giving the position or delimiter used to identify the data in the source file
o Any post-update routine that will be called for every transaction/record being uploaded

Sample Setup
A sample screenshot of the definition parameter is given below.
Detailed Working Procedure
This application is used for defining the mapping of T24 fields to the data received in the flat file from the client's legacy system. Mappings can be defined for different T24 applications (core as well as locally developed applications).

The ID of a record in this file can be any string of up to 35 alphanumeric characters. It need not be the name of the application; any name can be given to a record. It is also possible to define more than one record for an application, since this allows multiple loading sessions for the same application.
The parameters defined in this file include:
o OFS write or flat write: During data loading, data may have to be loaded into core tables or into locally developed tables. For core tables the data has to be loaded through the OFS route, whereas for locally developed tables the load can be either through the OFS route or a flat write. This depends on the structure and linkages of the locally developed table: if the load affects a single table that does not update any other table, it can be done through a flat write rather than the OFS route.
o This behaviour is governed by the mandatory field LOAD.TYPE, whose possible values are OFS.LOAD and WRITE. When OFS.LOAD is selected, data loading is done through the OFS route, whereas WRITE enables a direct write to the tables.
o OFS.ACTION: To verify the correctness of the data being uploaded, the value 'VALIDATE' can be set in this field. This performs the whole upload process except that the data is not written to disk; all validations are performed and any exceptions/errors are raised. The value 'PROCESS' performs the physical write to the database after validation.
o File type: In the case of a flat write, the data can be written directly to the $NAU, $HIS or LIVE files. This is determined by the value in field FILE.TYPE, whose possible values are NULL, $NAU and $HIS. If NULL is selected the write will be to the LIVE file; otherwise the direct write will be done to the respective file ($NAU or $HIS).
o Defining incoming data: Generally, the incoming data received from the client in the flat file comprises several records in the form of text strings. For the purpose of loading, these records have to be segregated, and this segregation can be done using different methods.
The possible methods of segregation are (i) use of delimiters to identify the records and the fields within a record, and (ii) use of the position of the different fields, e.g. field 7 in a record starts at position 25 and is 6 characters long. In the positional case it is not possible to define multi-value and sub-value fields, since the mapping positions are fixed and variable field lengths cannot be supported.
o This option is handled through the mandatory field IN.DATA.DEF, which has two options, viz. DELIM and POSITION. DELIM is used for defining the delimiters to be used, whereas POSITION is used for defining the positions of the fields.
o Delimiter definitions: It is possible to parameterise the delimiters used for identifying records, multi-values and sub-values. These definitions can be provided in the fields FM.DELIM, VM.DELIM and SM.DELIM, each of which represents one type of delimitation.
o ID generation: In T24, for some applications the ID has to be system generated, whereas for certain other applications the ID has to be provided manually. Accordingly, three options are provided in the mandatory field ID.TYPE, viz. AUTO, DATA and ROUTINE. With AUTO the ID is generated automatically by the system. With DATA the record ID has to be part of the incoming data. If an application requires a routine to generate the IDs, the option ROUTINE can be used; in this case a single argument is passed to, and returned from, the routine that forms the record ID. The name of the ID routine is defined in the field ID.ROUTINE. The ID position has to be defined when the data is segregated by fixed positions; the field ID.POSITION becomes mandatory if IN.DATA.DEF has the value POSITION.
o Escape sequence: There is a provision allowing the user to change certain characters in the data from one kind to another. For example, if a ',' has to be changed to ':', this can be done using the associated multi-value fields ESC.SEQ.FR and ESC.SEQ.TO. The value defined in ESC.SEQ.FR will be changed to the value in ESC.SEQ.TO.
o Mapping: This is a set of associated multi-value fields, from APPL.FIELD.NAME to FIELD.VALUE, used for mapping either OFS or flat writes. The set comprises APPL.FIELD.NAME, FIELD.POSITION, FIELD.LENGTH, FIELD.ATTRIB and FIELD.VALUE.
o The field APPL.FIELD.NAME defines the name of the field in the given T24 application (either core or local). The field has to be defined in the STANDARD.SELECTION record of the application. Either the field name or the field number can be input. It is possible to provide the company for which the record is loaded.
To do this, the field name must be set to UPLOAD.COMPANY and the company code must be provided in the associated field position. For LD.LOANS.AND.DEPOSITS, an option to provide both the LD.LOANS.AND.DEPOSITS record and the LD.SCHEDULE.DEFINE record is available; the values must be separated using a message separator, defined via MESSAGE.SEPARATOR.
o The field FIELD.POSITION defines the position of the field in the record supplied by the client. For example, if a local reference field in the customer record is defined as field number 40.2 (MERCHANT.NUMBER) and it is provided at the 25th position in the client's data, then APPL.FIELD.NAME should have the value MERCHANT.NUMBER and FIELD.POSITION should have the value 25. This is used when either DELIM or POSITION is chosen in IN.DATA.DEF.
o The field FIELD.LENGTH is mandatory when IN.DATA.DEF has the value POSITION; in other cases it is a no-input field.
o The field FIELD.ATTRIB has two options, viz. CONSTANT and ROUTINE. When CONSTANT is selected, the value in the field FIELD.VALUE is mapped to the application field as a constant value. When ROUTINE is selected, the name of the routine should be defined in FIELD.VALUE; the name must be preceded by '@' and the routine must be defined in PGM.FILE as type 'S'.
o The field FIELD.VALUE is used in conjunction with FIELD.ATTRIB. If FIELD.ATTRIB is set to CONSTANT, the value in this field is used for all incoming records.
o The field POST.UPDATE.ROUTINE is used to perform post-update processing for a record. In single-authoriser applications such as LD, this routine can be used to perform the authorisation of an unauthorised record.
o The field OFS.SOURCE defines the OFS.SOURCE record to be used when uploading the data into T24 via OFS. Two OFS.SOURCE records, DM.OFS.SRC and DM.OFS.SRC.VAL, are provided. Where field validation needs to be triggered for the incoming OFS messages, the DM.OFS.SRC.VAL option needs to be used.

Service Control
Table: DM.SERVICE.CONTROL
Purpose
This work-file application is used to control the actual upload process. The following details must be provided when the DM.SERVICE.CONTROL record is set up:

o Upload company – the company for which the data is uploaded. If this information is not provided in the incoming tape, it must be provided here.
o Flat file directory and file name – a valid directory name and file name.
o Number of sessions – the number of sessions with which the TSA.SERVICE must be run.
o Run status – used to request a STOP of the TSA.SERVICE.
Setting up the Service Control for the First Time
The ID of the DM.SERVICE.CONTROL record must be the same as that of the DM.MAPPING.DEFINITION record.

After setting up the above-mentioned fields, the record must be committed in INPUT mode. Then pick up the newly set-up record in Verify mode and commit. When run for the first time, the system checks whether the BATCH, TSA.SERVICE and TSA.WORKLOAD.PROFILE records have been created; this is checked in the file DM.SERVICE.CONTROL.CONCAT. If they do not exist, the system creates them. The following components are created:
Batch
A BATCH record with the ID DM.SERVICE-<> is created. It contains three jobs:
o DM.SERVICE.PRE.PROCESS – a single-threaded job which reads the incoming sequential file and writes to DM.SERVICE.DATA.FILE
o DM.SERVICE – a multi-threaded job which reads DM.SERVICE.DATA.FILE and updates the job list. Each record is then processed using OFS.GLOBUS.MANAGER and the error output is logged into <>.LOG.<>
o DM.SERVICE.POST.PROCESS – a single-threaded job which consolidates all the LOG files into one single error output file
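The three-job structure can be pictured with a small Python sketch. This is an assumption about the shape of the pipeline, not the actual jobs; in particular, the real DM.SERVICE hands each record to OFS.GLOBUS.MANAGER, which is replaced here by a trivial stand-in check.

```python
# Illustrative sketch only: pre-process -> multi-threaded workers with
# per-session logs -> post-process that consolidates the logs.
from concurrent.futures import ThreadPoolExecutor

def pre_process(lines):
    # DM.SERVICE.PRE.PROCESS: read the sequential file into a numbered work list.
    return list(enumerate(lines, start=1))

def worker(session_id, items, logs):
    # DM.SERVICE: process each record; errors go to this session's own log
    # (a stand-in for the per-session <>.LOG.<> files).
    for rec_id, data in items:
        if not data.strip():  # trivial stand-in for an OFS validation failure
            logs[session_id].append(f"record {rec_id}: empty record")

def post_process(logs):
    # DM.SERVICE.POST.PROCESS: merge all session logs into one error report.
    return [msg for session in sorted(logs) for msg in logs[session]]

lines = ["GOOD-1", "", "GOOD-2", "   "]
work = pre_process(lines)
sessions = 2
logs = {s: [] for s in range(sessions)}
with ThreadPoolExecutor(max_workers=sessions) as pool:
    for s in range(sessions):
        pool.submit(worker, s, work[s::sessions], logs)  # round-robin split
merged = post_process(logs)
```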
TSA.WORKLOAD.PROFILE Record
The number of sessions in the DM.SERVICE.CONTROL record is applied as the number of agents in this record.

TSA.SERVICE Record
A TSA.SERVICE record with the following contents is created.

As can be seen, the user in the TSA.SERVICE record is hard-coded to DMUSER. Creation of the DMUSER user record is part of the installation procedure. On verification, the status of the DM.SERVICE.CONTROL record is marked as START; the same is updated in the TSA.SERVICE file as well. The control log is updated with the details of the start request.
Service Status
This enquiry provides information for all Data Migration services as well as the TSM status. Once the TSA.SERVICE has been marked as START, the agents will be started automatically if the TSM is running in phantom mode. If the TSM is running in debug mode, the agents will have to be initiated manually to start the process.

Marking the Service Stop
To mark the DM.SERVICE as stopped, the DM.SERVICE.CONTROL record must be opened in Input mode and RUN.STATUS must be set to STOP. The TSA.SERVICE is then immediately marked for STOP, and the agents will stop after processing their current records.
Modifying the Number of Agents at Run Time
To modify the number of agents at run time, pick up the DM.SERVICE.CONTROL record in Verify mode and update the number of sessions as required. This will be propagated to the TSA.WORKLOAD.PROFILE application. Once the TSM reaches its review time, it will either spawn more or fewer agents as required (in phantom mode) or request that more or fewer agents be run (in debug mode). At present this option is available only in Classic.
Starting a Service / Resuming a Stopped Service
To mark a service as started, or to resume a stopped service, simply run the DM.SERVICE.CONTROL record in Verify mode.

Upload Details Menu
The Upload Details menu has enquiries to show the successfully uploaded scripts for every application. For instance, when the enquiry is run for internal accounts, the details of the total number of internal accounts loaded can be found.
Record Details Menu
The Record Details menu contains enquiries to view the status of records before and after the upload.

Script Count
This enquiry is a health-check enquiry, used before uploading the scripts. It can be used to derive flat-file details such as the script count and size.

ORD Details
This enquiry is used to get the OFS Request Details. The number of records uploaded/failed can be viewed using this enquiry. A sample screenshot is shown below.
Performance Monitoring

Table: JOB.TIMES

Purpose: This core application provides details about the number of records processed and the processing throughput. The same can be viewed in the enquiry JOB.PROGRESS.
Exception Handling
1. Only the OFS upload supports exception handling. The error file can be found in the same directory as the uploaded file. There are 'n' files created, one for each session. The ID of each created file is <>#LOG.<>, and finally all the files are merged into one single file, CUSTOMERERROR.txt.
2. The contents of the error file are as follows: * This file will be returned to the legacy system for verification and correction, and then uploaded again after removing the first two parts, thereby retaining only the corrected raw data to be uploaded into the system.
3. These guidelines explain the data migration of Miscellaneous Deals (MD) from a legacy system to T24.
4. Data Takeover for Guarantees Issued
5. Generic Guarantee, Bid Bond, Performance Bond, Standby LC and Shipping Guarantee
6. The following details are required for the takeover of guarantees.
7. Customer – The customer number of the customer against whom the contract should be created. Must be a T24 customer number.
8. Currency – The currency of the guarantee. Must be a valid currency code in the T24 CURRENCY application.
9. Principal Amount – The current outstanding guarantee amount.
10. Deal Date – The date on which the deal was struck. This date can differ from the contract start date and is used for information only. Standard T24 date format.
11. Value Date – Indicates the start date of the guarantee. The tenor of the guarantee is the period between the value date and the maturity date. This will be a back-value date for migration deals. Standard T24 date format.
12. Maturity Date – The maturity date of the contract. Must be the maturity date from the legacy system. For fixed-maturity contracts the maturity date should be provided in this field; for notice contracts the number of days to notice should be specified. Standard T24 date format.
13. Contract Type – Defines the type of contract: Contingent or Memo, Asset or Liability (i.e. CA, CL, MA or ML). For guarantees issued the type should be CA, i.e. Contingent Asset.
14. <>
15. Deal Sub Type – Further classifies the contract and assigns a range of category codes to each sub-type, e.g. BBOND (Bid Bond), PBOND (Performance Bond). The deal sub-types and related category codes are defined in the MD.PARAMETER table.
16. <>
17. Category – Contains the category code of the contract. The category code should be within the range specified for the deal sub-type in MD.PARAMETER.
18. <>
19. Limit Upd Reqd – Flag specifying whether limits should be used for this contract. Use 'YES' or 'NO'. For guarantees issued this would normally be YES.
20. <>
21. Advice Reqd – Option to suppress the delivery messages when the transaction is created/amended. It should be set to Y if advices/messages are required for issuance and amendment of the contract. If advices are required, it is recommended to set it to Y so that future amendments, such as a principal increase or decrease, will generate advices. The delivery messages generated during takeover can be suppressed using delivery parameters.
22. <>
23. Auto Expiry – Deal-level decision field controlling the expiry of the deal past its maturity date, effectively the reversal of contingent entries. If set to 'YES' the deal will expire on the date specified in MATURITY.DATE. If set to 'NO', the STATUS of the deal will remain 'CUR' even past the maturity date, and manual expiry will be required at the appropriate time. The value defaults from EXPIRY.MODE in MD.PARAMETER.
24. <>
25. Note: Where the bank's practice is to mature guarantees only when the original guarantee is returned, the takeover data will contain guarantees/deals already past their maturity date. In such cases, care should be taken to ensure that this field has the value 'NO' during migration.
26. Liquidation Mode – Defines the method by which overdue payments are handled. Usually set to automatic. Refer to the helptext for details.
27. <>
28. Events Processing – Specifies when principal movement changes and the raising of contingent entries take effect, i.e. ONLINE or ENDOFDAY (during COB).
29. <>
30. Receiving Bank – The correspondent bank to which the guarantee messages are addressed/sent when it is an issue. Must be a valid customer in the T24 CUSTOMER application.
31. Benef Cust 1 – Holds the customer number of the beneficiary. Must be a valid customer in the T24 CUSTOMER application.
32. Ben Address – The beneficiary address, if the beneficiary is not a customer of the bank. A multi-valued field of 35 free-format characters.
33. Note (1): If there are any scheduled charges (linked to FT.COMMISSION.TYPE) to be taken over during migration, the details of these future-dated charges can be taken over using the following multi-value set of fields:
34. Charge Date – The date on which the scheduled charges should be debited to the charge account. Standard T24 date format.
35. Charge Curr – The currency in which the charges detailed in the following fields are taken. The input should be a valid record in the CURRENCY file.
36. Charge Account – The account to be debited for the charge amount. The account should be in the charge currency. Must be a valid account in T24.
37. Charge Code – The FT.COMMISSION.TYPE record used to identify and calculate the charges for debit to the customer's account specified in the Charge Account field. This field forms a sub-value set together with the field Charge Amount.
38. <>
39. Charge Amount – The amount of charges to be debited to the charge account on the charge date. Care should be taken to input the correct amount during migration, as non-input in this field will result in the default charge amount (which may differ from the amount to be taken over). Standard T24 amount format.
40. Note (2): If any Provision/Cash Collateral/Margin has already been taken for the guarantees, it is advisable to suggest that these margins and their reversals be handled manually for the existing deals, so that they are not taken over. If it is required to take over the guarantees with the link to the existing margin amount, this should be done using the following fields. Please note that the outstanding margin amount will be taken over only to the internal account specified in MD.PARAMETER.
41. Provision – This field should be set to YES for migration deals where the guarantees are being taken over with the current outstanding margin amount linked to the respective guarantees.
42. Prov Dr Account – The migration account into which the customer account balances are taken over should be given as the provision debit account for takeover purposes. Must be a valid T24 account.
43. <>
44. Prov Amt – The outstanding provision (i.e. margin or cash collateral) amount relating to the guarantee being taken over should be input here. Standard T24 amount format.
45. Note (3): When the guarantees are being taken over along with the margin (provision) amount, please ensure that the balances in the accounts where these margin amounts are held are not taken over during the migration of account balances.
46. Note (4): Care should be taken to input the actual account of the customer (from which the provision/margin was originally taken) in the field 'Prov Dr Account' after the migration of guarantees is complete. This is necessary for the system to credit the provision/margin amount released subsequently back to the customer's account, failing which the amount will be credited to the migration internal account. It is therefore necessary to take the customer's account number (which had originally been debited for the provision/margin) along with the migration data, so as to re-input it in the deals after migration.
47. Data must be extracted in the following format.
Name of Field       Legacy System Corresponding Field   Characteristics
CUSTOMER                                                10N
CURRENCY                                                3A
PRINCIPAL AMOUNT                                        19N
DEAL DATE                                               8N
VALUE DATE                                              8N
MATURITY DATE                                           8N
CONTRACT TYPE                                           2A
DEAL SUB TYPE                                           5A
CATEGORY                                                5N
LIMIT UPD REQD                                          3A
RECEIVING BANK                                          15N
BENEF CUST1                                             10N
BEN ADDRESS                                             33AN
CHARGE DATE                                             8N
CHARGE CURR                                             3A
CHARGE ACCOUNT                                          As from legacy system
CHARGE CODE                                             10A
CHARGE AMOUNT                                           19N
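The characteristics column can be read as a maximum width plus a type code (N numeric, A alphabetic, AN alphanumeric); that interpretation is an assumption. As an illustrative sketch, the first few fields of one comma-separated extract line could be checked like this:

```python
# Illustrative sketch only: checking one comma-separated guarantee extract line
# against the first few characteristics above (e.g. 10N = numeric, max 10 chars).
import re

SPEC = [("CUSTOMER", "10N"), ("CURRENCY", "3A"), ("PRINCIPAL AMOUNT", "19N"),
        ("DEAL DATE", "8N"), ("VALUE DATE", "8N"), ("MATURITY DATE", "8N"),
        ("CONTRACT TYPE", "2A"), ("DEAL SUB TYPE", "5A"), ("CATEGORY", "5N")]

PATTERNS = {"N": r"[0-9]+", "A": r"[A-Za-z ]+", "AN": r"[A-Za-z0-9 ]+"}

def check_field(value, spec):
    # Split a spec such as '33AN' into the width (33) and the type code (AN).
    width, kind = re.match(r"(\d+)(AN|A|N)$", spec).groups()
    return len(value) <= int(width) and re.fullmatch(PATTERNS[kind], value) is not None

line = "1002345,USD,1500000,20230101,20230101,20261231,CA,BBOND,21050"
problems = [name for (name, spec), value in zip(SPEC, line.split(","))
            if not check_field(value, spec)]
```

Running such a check on the extract before the upload catches layout errors early, before they surface as OFS exceptions.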
48. It is preferable to have the data in comma-separated format. When a provision has been taken, the following fields must be provided too:
49. PROVISION, PROV.DR.ACCOUNT, PROV.AMT
50.
51. Data Takeover for Guarantee Received
52. Guarantee Received – Generic and Covering Facility
53. For takeover (i.e. data migration) of Guarantees Received, the above settings hold good. The significant changes in the data will be to CATEGORY, DEAL.SUB.TYPE, CONTRACT.TYPE (should be CL), RECEIVING.BANK and the limit details. For guarantees received, the limit might not be used (i.e. LIMIT.UPD.REQD will be set to NO). The RECEIVING.BANK is the bank to which any acknowledgement messages are sent.
54. Note: Normally, migration of scheduled charges or Provision/Margin may not be relevant for most Guarantees Received business scenarios.
55. <>
56.
57. Data Takeover for External Asset Register
58. Real Estate Register and Art & Antiques Register
59. Takeover (i.e. data migration) of External Asset Register records can be done using the above specifications. The primary variations in the data will be to the fields CATEGORY, DEAL.SUB.TYPE, CONTRACT.TYPE (should be MA) and the limit details. As with guarantees received, the limit might not be used (i.e. LIMIT.UPD.REQD will be set to NO).
60. Note: Normally, migration of scheduled charges or Provision/Margin may not be relevant for Memo Asset deals.
61. <>
62.
63. Data Takeover for Multi-Party/Participation Guarantees
64. Multi-Party/Participation Guarantees
65. Data migration of Multi-Party (issued) or Participation (received) guarantees is similar to the data migration of other guarantees. In addition to the various fields specified above, the fields PARTICIPANT and AMT.PARTICIPANT should be updated.
66. Participant
67. Additional participants in the current guarantee issued. Must be a valid customer in T24.
68. Amt Participant
69. The additional participant's guarantee amount, input in standard T24 amount format.
70. These guidelines explain the data migration of customer account data from an existing legacy system to the T24 core banking system.
71. Data Takeover for Account
72. Customer Accounts
73. Account types such as Savings, Current and Vostro accounts fall under this category. The details required for takeover of these types of accounts into T24 are:
74. Alt Acct Id – The original account number from the legacy system must go into this field.
75. Customer – Identifies the customer to whom the account belongs.
76. Account Name 1 – Name of the account; up to 35 valid alphanumeric characters.
77. Account Name 2 – Name of the account; up to 35 valid alphanumeric characters.
Short Name – Short name of the account; valid 35 alphanumeric characters.
Category – The product type for the account. The list of account products is provided in the Annexure —
Currency – The Account currency. The list of currency codes is provided in the Annexure —
Joint Holder – In case of jointly held accounts, the joint holder's customer number must be provided here. It is possible to define multiple joint holders. The joint holder must be a defined customer in T24.
Relation Code – The nature of the relationship in the jointly held account. Must be a defined relation code as in Annexure —
Passbook – If this account uses a passbook, this must be set to Y; otherwise ''.
Posting Restrict – Any special conditions to be applied on the Account. The following is the list of posting restrictions:
Posting Restriction  Description
1   Post No Debits
2   Post No Credits
3   Post No Entries
4   Refer Credits to Supervisor
5   Refer Debits to Supervisor
6   Refer All Entries to Supervisor
7   Account on Referral List
8   Account on Referral List – CR
9   Account on Referral List – DR
10  Post Credits to Savings a/c
11  Post Debits to Current a/c
12  Customer Deceased
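The restriction codes above can be carried as a simple lookup during extract preparation. A minimal Python sketch (the dictionary mirrors the table; the function name is illustrative, not part of the tool):

```python
# Posting restriction codes, as listed in the table above.
POSTING_RESTRICTIONS = {
    1: "Post No Debits",
    2: "Post No Credits",
    3: "Post No Entries",
    4: "Refer Credits to Supervisor",
    5: "Refer Debits to Supervisor",
    6: "Refer All Entries to Supervisor",
    7: "Account on Referral List",
    8: "Account on Referral List - CR",
    9: "Account on Referral List - DR",
    10: "Post Credits to Savings a/c",
    11: "Post Debits to Current a/c",
    12: "Customer Deceased",
}

def describe_restriction(code: int) -> str:
    """Return the description for a posting restriction code; reject unknown codes."""
    try:
        return POSTING_RESTRICTIONS[code]
    except KeyError:
        raise ValueError(f"Unknown posting restriction code: {code}")
```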
Account Opening Date – The opening date of the Account; must be a valid T24 date.
Account Mnemonic – The mnemonic of the account; must be less than or equal to 10 characters.
Data extracted from the legacy system must be in the following format:
Name of Field         Length                 Legacy Mapping
ALT.ACCT.ID           As from legacy system
CUSTOMER              1-10 N
ACCT.NAME.1           35 AN
ACCT.NAME.2           35 AN
SHORT.NAME            35 AN
CATEGORY              4 N
CURRENCY              3 AN
JOINT.HOLDER          1-10 N
RELATION.CODE         2 N
PASSBOOK              1 A
POSTING.RESTRICT      2 N
ACCOUNT.OPENING.DATE  8 N
ACCOUNT.MNEMONIC      10 AN
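The layout above lends itself to a pre-load length and type check on each extract record. A minimal sketch, assuming comma-separated records in the field order of the table (the lengths and N/AN/A codes come from the table; the helper name is illustrative):

```python
# (field name, max length, "N" numeric / "AN" alphanumeric / "A" alpha), per the table above.
# ALT.ACCT.ID is taken as-is from the legacy system, so it is not length-checked here.
ACCOUNT_LAYOUT = [
    ("CUSTOMER", 10, "N"),
    ("ACCT.NAME.1", 35, "AN"),
    ("ACCT.NAME.2", 35, "AN"),
    ("SHORT.NAME", 35, "AN"),
    ("CATEGORY", 4, "N"),
    ("CURRENCY", 3, "AN"),
    ("JOINT.HOLDER", 10, "N"),
    ("RELATION.CODE", 2, "N"),
    ("PASSBOOK", 1, "A"),
    ("POSTING.RESTRICT", 2, "N"),
    ("ACCOUNT.OPENING.DATE", 8, "N"),
    ("ACCOUNT.MNEMONIC", 10, "AN"),
]

def check_account_record(record: str) -> list:
    """Return a list of validation messages for one comma-separated account record."""
    errors = []
    fields = record.split(",")
    if len(fields) != len(ACCOUNT_LAYOUT) + 1:  # +1 for the leading ALT.ACCT.ID
        return [f"expected {len(ACCOUNT_LAYOUT) + 1} fields, got {len(fields)}"]
    for value, (name, max_len, kind) in zip(fields[1:], ACCOUNT_LAYOUT):
        if len(value) > max_len:
            errors.append(f"{name}: longer than {max_len} characters")
        if kind == "N" and value and not value.isdigit():
            errors.append(f"{name}: expected numeric value")
    return errors
```

Empty fields are allowed through, since several of the fields above (for example Joint Holder) are optional; mandatory-field rules would be applied per the application's own validation.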
It is preferable to have the data in comma-separated format.
Local data for Account
Information specific to the bank is defined in T24 local reference fields. The list of such fields and their applicable content is attached.
Data for the Account, both the core and local fields, must be provided in the order mentioned, in comma-separated format. Care must be taken to ensure that the ',' character does not appear within the data provided.
Interest and Charge Liquidation Account Link
The following information is required to link a different Interest or Charge posting account:
Interest Liqu Acct – The account number which must be used for Interest posting for this account. Must be an account defined in T24.
Charge Account – The account number which must be used for Charge posting for this account. Must be an account defined in T24.
Data extracted from the legacy system must be in the following format:
Name of Field      Length                     Legacy Mapping
ALT.ACCT.ID        As from legacy system
INTEREST.LIQU.ACC  As from the legacy system
CHARGE.ACCOUNT     As from the legacy system
It is preferable to have the data in comma-separated format.
Overdraft Link to Account
After creation of Accounts and Limits, the created overdraft limits must be linked to the accounts. While creating the link, the following information must be provided:
Alt Acct Id – The original account number from the legacy system must go into this field.
Limit Ref – Applicable for overdraft accounts. Provide the limit reference and sequence number. E.g. if the overdraft Limit reference is 100 and this is the first overdraft limit, then the value must be provided as 100.01.
Data extracted from the legacy system must be in the following format:
Name of Field  Length                 Legacy Mapping
ALT.ACCT.ID    As from legacy system
LIMIT.REF      1N
It is preferable to have the data in comma-separated format.
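The reference-plus-sequence convention described above (e.g. 100.01 for the first limit under reference 100) can be produced with a small helper during extract preparation. A sketch (the function name is illustrative):

```python
def limit_ref(reference: int, sequence: int) -> str:
    """Combine a limit reference and its sequence number into the REF.NN form,
    e.g. reference 100, first limit -> "100.01"."""
    if not 1 <= sequence <= 99:
        raise ValueError("sequence number must be between 1 and 99")
    return f"{reference}.{sequence:02d}"
```

The zero-padded two-digit sequence keeps the value sortable and matches the 100.01 example in the text.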
Limit Link for Export Advances
After creation of Accounts and Limits, the created export advance limits must be linked to the accounts. While creating the link, the following information must be provided:
Alt Acct Id – The original account number from the legacy system must go into this field.
Limit Ref – Applicable for export advances accounts. Provide the limit reference and sequence number. E.g. if the Limit reference is 500 and this is the first limit under it, then the value must be provided as 500.01.
Data extracted from the legacy system must be in the following format:
Name of Field  Length                 Legacy Mapping
ALT.ACCT.ID    As from legacy system
LIMIT.REF      1N
It is preferable to have the data in comma-separated format.
This user guide explains the data migration of Customer data from an existing legacy system to T24.
Data Takeover for Customer
This file holds the basic information for any Customer that the bank deals with:
Customers holding contracts and / or accounts
Correspondent banks
Brokers
Guarantors and any other entity that the bank does business with.
The following are the details required to create a customer in T24:
Customer Number – Also known as the @ID, this must be a valid number, 1 to 10 digits long, and is used to uniquely identify the customer across the bank. It is also possible to allocate a specific range for special customers of the bank, if the bank has such a practice.
Mnemonic – An alternative means of referencing the Customer; the value in this field allows easy retrieval of often-used customer numbers. The mnemonic must be at least 3 and at most 10 characters long. Alphanumeric values and '.' are accepted, but the first character must be alpha. If the bank does not have a specific requirement for mnemonics, the system can be made to generate this.
Short Name – Customer short name, which will be used as enrichment in various applications when the customer number or mnemonic is input. The short name must be at least 3 and at most 35 characters long. The '&' character is allowed but is replaced with ' ' if the short name is used in SWIFT messages. The '/' character is allowed in the short name, but not as the first character. Note: While duplicates are allowed in the short name, it is recommended that this field is kept as unique as possible for the user to identify the customer.
Name 1 – This is the correspondence name of the Customer. Name 1 must be at least 3 and at most 35 characters long. Validations are similar to Short Name, and the special characters allowed are '&' and '/'. When the length of NAME.1 is not adequate, NAME.2 can be input.
Street – Identifies the first line of the Customer's main address. This field is mandatory input and accepts 3-35 valid SWIFT characters.
Town Country, Post Code, Country – These three fields, together with Street, record the address of the Customer. Each of these can be 1-35 valid SWIFT characters.
Sector – Customers can be classified by Sectors in T24. The list of valid sector codes is crystallised during the Business Review process, and the same as extracted is available in Annexure —. If the legacy system does not have this classification, the bank may either opt to do this classification before inputting to T24 (recommended) or the system may be requested to put all customers into one or two pre-defined codes.
Account Officer – Identifies the main Relationship Officer responsible for the customer. The account officer information extracted from the legacy system must be mapped and converted to valid T24 Account Officer codes before migration. This field forms the basis for the production of MIS reports and hence needs careful consideration while setting up. The list of officers in the bank collected during the business review process is available in Annexure —.
Industry – Customers can be classified by Industry in T24. The list of valid Industry codes is crystallised during the Business Review process, and the same as extracted from TAABS is available in Annexure —. If the legacy system does not have this classification, the bank may either opt to do this classification before inputting to T24 (recommended) or the system may be requested to put all customers into one or two pre-defined codes.
Target – Classification code to assess the potential / worthiness of the Customer to the bank. The Target information extracted from the legacy system must be mapped and converted to valid T24 target codes before migration. The list of valid target codes and associated descriptions is available in Annexure —.
Customer Status – Identifies the current status of the Customer. The Customer status information extracted from the legacy system must be mapped and converted to valid T24 customer status codes before migration. The list of Status codes agreed during Business Review is available in Annexure —.
Nationality – Identifies the nationality of the Customer.
Residence – Identifies the country of residence of the Customer. Both these fields should be valid country codes. A list of country codes is in Annexure —.
Customer Liability – The liability customer must be a valid customer code. When the liability customer is the created customer himself, then the number from the customer id must be provided here. Please ensure that, when data is provided, the liability customer is created before the current customer.
Date of Birth – The date of birth of the customer; must be a valid T24 date.
Language – The language of correspondence for the customer. Defaults to 1 (English) if only English is available.
Contact Date – The date the customer was opened; must be a valid T24 date.
Text – Multi-valued free-format narrative field to hold information about the customer.
Introducer – Name of the introducer; 35-character free-format text is allowed as input.
RELATION information, if required, can be uploaded after the successful migration of valid customer information.
The bank must provide the Customer information in the following format:
Name of Field       Characteristics  Legacy system corresponding field
CUSTOMER.NUMBER     1-10 N
MNEMONIC            3-10 AN
SHORT.NAME          3-35 AN
NAME.1              3-35 AN
NAME.2              3-35 AN
STREET              3-35 AN
TOWN.COUNTRY        35 AN
POST.CODE           35 AN
COUNTRY             35 AN
SECTOR              4 N
ACCOUNT.OFFICER     3 N
INDUSTRY            4 N
TARGET              3 N
CUSTOMER.STATUS     3 N
NATIONALITY         2 AN
RESIDENCE           2 AN
CUSTOMER.LIABILITY  1-10 N
DATE.OF.BIRTH       8 N
LANGUAGE            1 N
CONTACT.DATE        8 N
TEXT                35 AN
INTRODUCER          35 AN
It is preferable to have the data in comma-separated format.
Local data for Customer
Information specific to the bank is defined in T24 local reference fields. The list of such fields and their applicable content is attached.
Data for the customer, both the core and local fields, must be provided in the order mentioned, in comma-separated format. Care must be taken to ensure that the ',' character does not appear within the data provided, e.g. within the Customer Address.
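Because embedded commas (common in address lines) would shift every following field, the extract step can replace them before writing each record. A minimal sketch (the space replacement is an assumption for illustration, not a T24 requirement):

```python
def to_record(fields: list) -> str:
    """Join field values into one comma-separated record, replacing any embedded
    commas (e.g. in address lines) with a space so the delimiter stays unambiguous."""
    return ",".join(str(value).replace(",", " ") for value in fields)
```

A site may equally choose to strip commas or substitute another character, as long as the choice is applied consistently across all extract files.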
Data Takeover for Customer Relation
Takeover of customer relationships can be performed only after taking over Customer information. The following information is required:
Customer Number – The customer reference for which the relation is defined. 1 to 10 digits long.
Relation Code – Indicates the type of relationship this customer holds with the related customer. The list of relationships is provided in Annexure —.
Rel Customer – The customer reference of the related customer.
It is possible to define multiple relationships for a customer. Note: When taking over customer relationships, it is sufficient to define one side of the relationship; the reverse side is automatically derived and populated by T24. When taking over relationships, data must be provided in the following format:
Name of Field    Characteristics  Legacy system corresponding field
CUSTOMER.NUMBER  1-10 N
RELATION.CODE    2 N
REL.CUSTOMER     1-10 N
It is preferable to have the data in comma-separated format.
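Since T24 derives the reverse side automatically, the extract only needs one direction of each relationship; supplying both would be redundant. A sketch that keeps the first direction of each customer pair and drops the mirror row (the tuple layout follows the format table above; the helper name is illustrative):

```python
def one_sided(relations: list) -> list:
    """Given (customer, relation_code, related_customer) rows, keep only the first
    direction seen for each customer pair; T24 populates the reverse side itself."""
    seen = set()
    kept = []
    for customer, code, related in relations:
        pair = frozenset((customer, related))
        if pair not in seen:
            seen.add(pair)
            kept.append((customer, code, related))
    return kept
```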
Data Takeover for Customer Address
This file holds the Customer address. This includes both:
Additional Print address. The default print address is automatically copied over from the customer table.
Swift address of the customer.
The following are the details required for the additional print address:
Address Key – The id of the DE.ADDRESS application. This must be of the following format: <>. C-<> . <>. <>. The CARRIER must be PRINT. The address number must be 2-9.
Short Name – Customer short name, which will be used as enrichment in various applications when the customer number or mnemonic is input. The short name must be at least 3 and at most 35 characters long. The '&' character is allowed but is replaced with ' ' if the short name is used in SWIFT messages. The '/' character is allowed in the short name, but not as the first character.
Name 1 – This is the correspondence name of the Customer. Name 1 must be at least 3 and at most 35 characters long. The '&' character is allowed but is replaced with ' ' if the name is used in SWIFT messages. The '/' character is allowed, but not as the first character.
Name 2 – When the length of NAME.1 is not adequate, NAME.2 can be input. Validation rules are similar to NAME.1, except that NAME.2 is not mandatory.
Street Addr – Identifies the first line of the Customer's main address. This field is mandatory input and accepts 3-35 valid SWIFT characters.
Town Country – Identifies the town and country of the Customer's residence.
Post Code – Identifies the postcode of the Customer address. 1-35 valid SWIFT characters. Non-mandatory input.
Country – Identifies the country of the Customer address. 1-35 valid SWIFT characters. Non-mandatory input.
The bank must provide the information in the following format:
Name of Field  Characteristics  Legacy system corresponding field
ADDRESS.KEY    30 AN
SHORT.NAME     3-35 AN
NAME.1         3-35 AN
NAME.2         3-35 AN
STREET.ADDR    3-35 AN
TOWN.COUNTRY   35 AN
POST.CODE      35 AN
COUNTRY        35 AN
It is preferable to have the data in comma-separated format.
When taking over SWIFT addresses, the following information is required:
@ID – The id of the DE.ADDRESS application. The id is of the following format: <>. C-<> . <>. <>. The CARRIER must be SWIFT. The address number must be 1-9.
Delivery Address – SWIFT address of the customer; 8 or 11 characters for normal customers and 9 or 12 characters for companies.
Short Name – Customer short name, which will be used as enrichment in various applications when the customer number or mnemonic is input. The short name must be at least 3 and at most 35 characters long. The '&' character is allowed but is replaced with ' ' if the short name is used in SWIFT messages. The '/' character is allowed in the short name, but not as the first character.
Name of Field     Characteristics  Legacy system corresponding field
ADDRESS.KEY       30 AN
SHORT.NAME        3-35 AN
DELIVERY.ADDRESS  12 N
It is preferable to have the data in comma-separated format.
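The delivery-address length rule above (8 or 11 characters for normal customers, 9 or 12 for companies) can be screened before load. A sketch; the `is_company` flag is an assumption about how the extract distinguishes the two cases:

```python
def valid_delivery_address(address: str, is_company: bool = False) -> bool:
    """Check the SWIFT delivery address length: 8 or 11 characters for normal
    customers, 9 or 12 for companies, per the rule above. is_company is an
    illustrative flag, not a field of the extract format."""
    allowed = (9, 12) if is_company else (8, 11)
    return len(address) in allowed
```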
Data Takeover of BIC Codes
This file holds the list of BIC codes. The following fields are required to be passed for takeover:
@ID – The BIC id. Must be 8 or 11 characters long.
Tag – Tag identifier.
Institution Name – Name of the institution; 35-character text field.
City – Name of the city; input is a 35-character text field.
Sub Type – The type of financial institution; 4 characters long.
Services – This field identifies the Value Added Service(s) the financial institution has subscribed to; 20 characters long.
Address – Address of the BIC; input is 35 characters long.
Location – The location of the BIC; input is 35 characters long.
POB Number – Post office box number for the BIC code; input is 35 characters long.
POB Location – Post office box location; input is 35 characters long.
POB Zip – The zip code for the Post Office Box that the financial institution may use; input is 15 characters long.
POB Country – The country where the Post Office Boxes reside; input is 2 characters long.
The bank must provide the BIC information in the following format:
Name of Field     Characteristics  Legacy system corresponding field
BIC.KEY           8 A or 11 A
TAG               2 AN
INSTITUTION.NAME  35 AN
CITY              35 AN
SUB.TYPE          4 AN
SERVICES          20 AN
ADDRESS           35 AN
LOCATION          35 AN
POB.NUMBER        35 AN
POB.LOCATION      35 AN
POB.ZIP           15 AN
POB.COUNTRY.CODE  2 A
It is preferable to have the data in comma-separated format.
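The BIC id rule above (8 or 11 characters) can likewise be screened in the extract. A minimal sketch; only the 8-or-11 length rule comes from the text above, while the character structure follows the general BIC convention (ISO 9362) and is an added assumption:

```python
import re

# 4-letter institution code, 2-letter country, 2 alphanumeric location characters,
# plus an optional 3 alphanumeric branch code: 8 or 11 characters in total.
BIC_PATTERN = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}([A-Z0-9]{3})?$")

def valid_bic(bic: str) -> bool:
    """Return True when the id is a plausibly structured 8- or 11-character BIC."""
    return bool(BIC_PATTERN.fullmatch(bic))
```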