Statalist



Re: st: RE: Group building according to given report dates and company


From   "Hua Pan" <[email protected]>
To   [email protected]
Subject   Re: st: RE: Group building according to given report dates and company
Date   Fri, 20 Mar 2009 17:46:52 +0100

Dear Nick,

Thank you very much for your suggestion. Just as you advised, I also want to make a subdivision into time periods, not smaller datasets. But I am not sure how to do that. Maybe you have a good idea?
My dataset after merging A and B looks like this:

permno   date       repdats    ret
10002    04012004   04292004    .018296152
10002    04022004   07272004    .010668194
10002    04052004   10272004   -.000555568
10002    04062004   01242005   -.018343519
10002    04072004   04222005    .018686293
10002    04082004   07202005   -.016120011
10002    04122004   10122005   -.028248586
10002    04132004   01202006   -.012790768
10002    04142004   04212006   -.007656017
10002    04152004   07212006   -.001780456
10002    04162004   10202006    .029131973
-----------------------------------------------------
10002    04192004   10202006   -.030040355
10002    04202004   10202006    .000595493
10002    04212004   10202006    .006547655
...      ...        ...
10002    12312006   10202006

It looks very strange, because I have only 11 report dates but around 700 return dates for one permno, so the last repdats, 10202006, simply repeats for all the remaining observations.
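
I wonder whether, instead of -merge-, something like the following would
build the time-period groups. This is only a guess on my part (untested),
and it assumes that A.dta holds permno, date, and ret, that B.dta holds
permno and repdats, and that both date variables have first been converted
to Stata daily dates with date():

use "C:\A.dta", clear
joinby permno using "C:\B.dta"
* for each return date, keep only the latest report date on or before it
keep if repdats <= date
bysort permno date (repdats): keep if _n == _N
* number the report periods within each permno
egen period = group(permno repdats)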

Best Regards

Hua




-------- Original Message --------
> Date: Fri, 20 Mar 2009 15:13:33 -0000
> From: "Nick Cox" <[email protected]>
> To: [email protected]
> Subject: st: RE: Group building according to given report dates and company

> There is a lot of detail that you are expecting members to upload here. I
> have not tried, but I have some broad strategic advice, exactly as before. 
> 
> There is no _obvious_ need here for subdividing into smaller datasets
> followed by recombination. 
> 
> What is most evident is that you have a subdivision into time periods. So,
> create a variable that is 1, 2, etc. for your distinct time periods and
> then use that new variable with -by:- as in previous problems. 
> 
> One of the easy details about -by:- is that you can define blocks of
> observations on as many criteria as you wish. 
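> 
> For example, once you have a period variable (1, 2, etc., as above), a
> minimal sketch (assuming your data are sorted by date within period and
> ret is the daily return) might be:
> 
> bysort permno period (date): gen cumret = exp(sum(ln(1 + ret))) - 1
> 
> Here sum() is a running sum, so cumret holds the cumulative compound
> return up to each day within each period.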
> 
> Nick 
> [email protected] 
> 
> Hua Pan
> 
> Now I have a big problem; I have thought about it over and over again and
> can't find a way to solve it. Maybe someone here can help me.
> 
> I have two datasets, say A and B. A has permno (identifier), daily
> return, and date. B has permno and repdate (report date). 
> A
> permno	   date             ret 
> 10001	01.Apr.2004	 .01793105
> 10001	02.Apr.2004	-.046070479
> 10001	05.Apr.2004	 .022727251
> 10001	06.Apr.2004	-.005555551
> 10001	07.Apr.2004	 .016759828
> …    	…		    …
> 10001   31.Dec.2008         …
> 
> 10002	01.Apr.2004	.018296152
> 10002  	02.Apr.2004	.010668194
> …	     …              …
> 10002	31.Dec.2008	     …
> 
> 10003   01.Apr.2004          …
> …	    …		      …	
> 93105       …                 …
> 
> B:
> permno    repdate          
> 10001  	  21.Apr.2004
> 10001     20.July.2004
> 10001     26.Oct.2004
> 10001     03.Feb.2005
>           04.May.2005
> …             …
> 10002     23.Apr.2004
> 10002     22.July.2004
> …              …
> 93105         …
> …
> I wish to calculate the cumulative daily return for each company and each
> period, running from two days after the report day to the day before the
> next report day. E.g., for permno 10001, I want to calculate the cumulative
> return for the periods (21.Apr.2004 + 2 = 23.Apr.2004, 20.July.2004 - 1 =
> 19.July.2004), (22.July.2004, 25.Oct.2004), (28.Oct.2004, 02.Feb.2005), ...
> For permno 10002, the first period runs from 23.Apr.2004 + 2 + 1 =
> 26.Apr.2004 (because 25.Apr.2004 is a Sunday, I have to add one day) to
> 21.July.2004, ...
> (The cumulative return for each day within a period is to be calculated,
> almost 60 days per period.)
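> 
> In Stata terms, I imagine the weekend adjustment could look something like
> this (just a guess on my part; it assumes repdate is a Stata daily date,
> and that dow() returns 0 for Sunday and 6 for Saturday):
> 
> gen start = repdate + 2
> replace start = start + 2 if dow(start) == 6   // Saturday -> Monday
> replace start = start + 1 if dow(start) == 0   // Sunday -> Monday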
> 
> At first I tried to put datasets A and B together with -merge-, but it was
> not successful:
> use "C:\A.dta", clear
> sort permno date
> save "C:\A.dta", replace
> use "C:\B.dta", clear
> sort permno repdate
> merge permno using "C:\A.dta"
> keep if _merge == 3
> drop _merge
> save "C:\AB.dta”
> 
> Many of the dates and returns from A get deleted. Maybe I should try -joinby-,
> or just use A as the master dataset? How can I put them together properly? 
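> 
> Perhaps (although I have not tried it) something like
> 
> use "C:\A.dta", clear
> joinby permno using "C:\B.dta"
> 
> would pair every return date with every report date of the same permno,
> so that I could then keep only the pairs I need?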
> 
> Then I would like to build several groups according to permno and report
> date: 
> group1: permno 10001 with daily return from 23.Apr.2004 to 19.July.2004 
> group2: permno 10001 with daily return from 22.July.2004 to 25.Oct.2004
> group3: permno 10001 with daily return from 28.Oct.2004 to 02.Feb.2005
> ….
> group n: permno 10002 with daily return from 26.Apr.2004 to 21.July.2004
> …..
> group m: permno 93105 with…..
> 
> After I drop the observations that don't belong to any group, I can
> calculate the cumulative return for each group (thanks again to Austin for
> this code):
> 
> bys group (date): g cumul = 1 + ret if _n == 1
> by group: replace cumul = (1 + ret) * cumul[_n-1] if mi(cumul)
> replace cumul = cumul - 1
>  
> Finally I drop the groups with missing values and calculate the mean
> cumulative return over the observations that share the same day number
> within their group:
> by group, sort: generate nr=_n
> by nr, sort: egen mean_cumul=mean(cumul)
> 
> So the question is how I can put the two datasets together, assign the
> observations to groups according to permno and report date, and make sure
> that each period begins and ends on a working day. Once I have solved this
> problem, I can calculate the mean cumulative return. 
> Any suggestion would be greatly appreciated. Thank you very much.
> 

*
*   For searches and help try:
*   http://www.stata.com/help.cgi?search
*   http://www.stata.com/support/statalist/faq
*   http://www.ats.ucla.edu/stat/stata/


