Scenario Data Update Steps:
What follows is a guide for updating the data used in stress testing at TD.
1 After every QEF meeting (September and December), run the following EViews program, which imports the relevant data into a work file: O:\tdecon\Risk_team\QEF_Data\qef_data.prg
Create a database in EViews and name it QEF_Data.edb. Store the data pulled into the work file by qef_data.prg into the QEF_Data.edb database, then place the database into the respective quarter folder (e.g. the September folder for the September QEF meeting) located in O:\tdecon\Risk_team\QEF_Data\.
QEF_Data.edb contains the data the stress team uses to fill in any missing data for the rest of the year and for the _tdbs (TD Baseline Scenario) forecasts.
RESULT: a database called QEF_Data.edb located in O:\tdecon\Risk_team\QEF_Data\September (if the month is September)
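Note
A minimal EViews sketch of the create-and-store step (this is an assumption about how the GUI steps translate to commands, not the documented procedure; the series names gdp_us and cpi_ca are placeholders, not actual QEF mnemonics):
    ' run from the work file created by qef_data.prg
    ' create QEF_Data.edb in the quarter folder and make it the default database
    dbcreate O:\tdecon\Risk_team\QEF_Data\September\QEF_Data
    ' store the imported series from the work file into the default database
    store gdp_us cpi_ca
The same result can be reached through the GUI by selecting the series in the work file and choosing "store to database".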
2 Back Up and then Update the EViews database O:\tdecon\HDM\Riskdata\RISKDATA\riskdata.edb:
(2a) Copy the old O:\tdecon\HDM\Riskdata\RISKDATA\riskdata.edb database into a folder named with the date, inside O:\tdecon\HDM\Riskdata\RISKDATA\Archive\.
(2b) Update the database by running O:\tdecon\HDM\Riskdata\RISKDATA\riskdata_1.prg. EViews will say "Uncompressing file".
(2c) Create a separate database for each of the stress tests (CCAR/EWST/MST) and store it. Each of these is a copy of riskdata.edb placed into a folder named with the stress test and the year (e.g. CCAR_2025 for CCAR in 2025).
Note
The program files to run are located at O:\tdecon\Risk_team\Riskdata\Programs
(2d) All data upload files (i.e. EViews work files) have TWO tabs. Upload the data in the FIRST tab to the risk database located at O:\tdecon\HDM\Riskdata\RISKDATA and named riskdata.edb.
Note
There are actually four database files (riskdata.e0, riskdata.e1a, riskdata.e1b, riskdata.edb), but we only need to deal with the .edb.
- The first tab in the work file contains data until the end of the current year (actuals, plus baseline forecasts where actuals are missing).
- The second tab has the alias _tdbs; it contains the baseline for our scenario and needs to be uploaded to all stress testing databases.
QUESTION: Is the _tdbs tab data uploaded to the databases in the CCAR_2025 (etc.) folders created in O:\tdecon\HDM\Riskdata\RISKDATA\?
Note
How to upload from a workfile to a database: Select data in the work file, right click and select 'store to database' and then select the required database.
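Note
The upload can also be done by command. A minimal sketch (an assumption, not the documented procedure; the series names are placeholders for whatever is selected in the work file):
    ' open the risk database and make it the default
    dbopen O:\tdecon\HDM\Riskdata\RISKDATA\riskdata
    ' store the selected series from the active work file page into the default database
    store gdp_us cpi_ca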
3 The following concerns O:\tdecon\Risk_team\Riskdata:
(3a) Update O:\tdecon\Risk_team\Riskdata\zzz_International.xlsx with the data given by the Fed (it should be saved in the supervisory folder until December; the same applies to the foreign exchange file).
Note
The supervisory folder is located at O:\tdecon\Modelling\Stress_Test\Fed_Stress_Test\CCAR_2025\Supervisory
Note
In addition, see O:\tdecon\Modelling\Stress_Test\Fed_Stress_Test\CCAR_2025\Supervisory\Scenario_Data for the Fed data.
(3b) Update the data files O:\tdecon\Risk_team\Data_Requests\2025\Data\zzz_riskdata and O:\tdecon\Risk_team\Data_Requests\2025\Data\zzz_riskdata_eop.
Note
Update the files by checking where each series comes from and pulling that data into Excel.
- Open O:\tdecon\Risk_team\Data_Requests\2025\Data\zzz_data.xlsx. This file contains the risk mnemonics and the Haver/other program locations for all the data we send.
Note
_Q is quarterly data, _M is monthly data, and SA means the series needs to be seasonally adjusted. All instructions are also listed on the "Legend" tab of zzz_data.xlsx.
- Update the CoStar data by downloading it from https://www.costargroup.com/ccrsi. Copy the data from the "Regional" tab, "Value-weighted" columns.
QUESTION: Where do I copy the CoStar data to? Is it copied into zzz_data.xlsx?
- Get the bbb10y value for the quarter that just passed and round it to one decimal place: ROUND('O:\tdecon\HDM\HIST\Financials\[BofAML_BBB.xlsx]Daily Calculations'!E1538, 1)
QUESTION: Do we take E1538 or the last number in the column?
- To get the bbb10y data:
  - Go to the Daily tab.
  - Get data from the Bloomberg terminal: type "IND" and select "IND13 Data Download".
  - Type C7A4 and C4A4 and choose "Yield to Maturity (Conventional)".
  - Copy the data to the BBB10Y tab in zzz_data.
- Make sure we have the most up-to-date history given by the Fed (the data should extend until the end of the previous year).
- Get mortgage rates from O:\tdecon\HDM\Riskdata\zzBloomberg Financials\[FinVarsBloombergCodesData.xlsx]Quarterly Averages -> refresh the Bloomberg terminal, double-check by manually calculating the average, copy-paste the "values" to the quarterly tab, then drag the formula on the MTG tab in zzz_riskdata.
- Get MSCI data from O:\tdecon\HDM\Riskdata\zzBloomberg Financials\[FinVarsBloombergCodesData.xlsx]MSCI -> refresh the Bloomberg terminal, then drag down the formula on the MSCI tab in zzz_riskdata_eop.
- Run the R code for the fipi data and update the working directory (line 7).
Note
The R code is stored at O:\tdecon\Risk_team\Riskdata\Programs\Stats_Can_fipi_data_extract.R
Note
The data is very lagged (usually missing up to 2 quarters). Copy the data into the zzz_riskdata file under the FIPI tab.
(3c) The regionals file O:/tdecon/Risk_team/Riskdata/zzz_regionals.xlsx:
- All Moody's tabs: powertools > account > clear > force refresh > workbook. Then make sure that the data is up to date on the UPD tabs (you'll need to drag the data down each quarter with the correct date).
- Update the green tabs and upload to the yellow tabs (the yellow tabs are the ones updated in terms of calculations).
QUESTION: Is the data to be copied into the regionals file or from the regionals file into another file?
4 Open the Programs folder in O:\tdecon\Risk_team\Riskdata and run the programs therein:
(4a) For ALL programs below:
- FIRST open the file, change the last date of the program to the next year (e.g. 2032 would become 2033), and change the jump-off year (change 2024Q2 to 2025Q2), as sketched in the note below.
- KEEP IT AS Q2 because we have some Q3 data available and some not; wherever it is missing, we use baseline data.
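Note
A hypothetical sketch of what the date change might look like inside one of these programs (the string variable names %jumpoff and %enddate are placeholders; the actual programs may define the dates differently):
    ' jump-off quarter: the last period taken as actual/QEF data (2024Q2 becomes 2025Q2)
    %jumpoff = "2025q2"
    ' last date of the program: extend by one year each cycle (e.g. 2032 becomes 2033)
    %enddate = "2033q4"
    ' the program would then use these dates, e.g. in its sample statement
    smpl %jumpoff %enddate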
(4b) First run O:/tdecon/risk_team/Riskdata/us_riskdata.prg, then O:/tdecon/risk_team/Riskdata/ca_riskdata.prg, then O:/tdecon/risk_team/Riskdata/Extra/data_update_int_vars.prg (NOTE: DO THIS IN THE ORDER GIVEN! See the sketch after this sub-section.)
- Update the dates to the quarter we are in and update the locations. Make sure series/alphas are set to "Average Observations" and check "no conversion of partial periods".
Note
These options are located in EViews at options > general options > series and alphas. Use this for monthly data but DO NOT USE it for daily data.
- Check whether we need to update the location on the QEF page (i.e. the location of other economists' work files), if it exists.
Note
For ca_riskdata, update the string "data" on line 36 and make sure the QEF data location is up to date on the QUARTERLY page. QUESTION: Are these all EViews pages?
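Note
If the three programs need to be kicked off together, here is a hedged sketch of a small driver program (this assumes your EViews version supports the exec command; it is not part of the documented procedure):
    ' run the three update programs in the required order
    exec "O:/tdecon/risk_team/Riskdata/us_riskdata.prg"
    exec "O:/tdecon/risk_team/Riskdata/ca_riskdata.prg"
    exec "O:/tdecon/risk_team/Riskdata/Extra/data_update_int_vars.prg"
Remember to make the date and option changes from (4a) and (4b) in each program before running them.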
(4c) Run data_update_sum; make sure series/alphas are set to "Sum Observations" and check "no conversion of partial periods".
(4d) Run data_financials
- Update the dates/locations. Make sure series/alphas are set to "Average Observations" and check "no conversion of partial periods".
- Update the following highlighted number; it should be available up to the quarter before the one we are currently in. Open the file; it will be linked to another file, so update that one as well.
- Make sure all quarterly data is available for us5y and us10y.
- Make sure all quarterly data is available in Haver for the international yields (all days of the quarter; sometimes this data comes in during the second week).
- Especially au3m, as it is daily data updated monthly.
- The Haver codes for the international yields are:
  Germany: r134m1y@intdaily
  Japan: r158m3m@intdaily
  UK: r112m5y@intdaily
  Australia: r193zaq@intdaily, r193ma@intdaily
(4e) Run vixmax; make sure series/alphas are set to "Max Observations" and check "no conversion of partial periods".
(4f) Run riskdata_eop; make sure series/alphas are set to "Last Observations" and check "no conversion of partial periods".
- Make sure all quarterly data is available for exchange rates.
- Run us_regionals and ca_regionals; make sure series/alphas are set to "Average Observations" and check "no conversion of partial periods".
- Make sure any data we are getting on the QEF page directly from the economists' work files is up to date (you might need to change the location or the quarter alias).
(4g) US regionals: double-check that they are seasonally adjusted to the latest quarter in Haver (TD5 -> "H SA Core Logic HPI"); if not, use this location to update: O:\tdecon\HDM\SA_CoreLogic UPD.xlsx
(4h) Lastly, we have volumes: run us_volumes and ca_volumes; make sure series/alphas are set to "Average Observations" and check "no conversion of partial periods".
- For CA_Volumes update the following location/alias: