Post on 20-May-2020
transcript
2014 Academic Year Graduation Thesis

Statistical Modeling of Seasonal Changes in Vegetation Indices
— Application to the Vegetation of Shikoku Using MODIS EVI —

Kochi University, Faculty of Science, Department of Applied Science, Information Science Course
Student ID: B113K289

Submitted: February 6, 2015
Abstract

On the Earth where we live, environmental problems such as global warming and desertification, driven by human activity, have been growing more severe year by year. It is therefore necessary to launch Earth-observation satellites and monitor the state of the Earth's atmosphere, water, temperature and vegetation. The observations are provided as satellite image data and are used in many fields, from research on the global environment to our daily lives.

In recent years the data acquired by Earth-observation satellites have improved in both spatial and temporal resolution and now form big data about the global environment. For example, the MODIS sensor carried on the Earth-observation satellites Terra and Aqua observes the Earth in 36 spectral bands, and products concerning the atmosphere, the land surface and water, with a highest spatial resolution of 250 m, have been provided continuously since 2000. One of these products is the vegetation index (VI), which quantifies the activity of vegetation by exploiting the reflectance characteristics of plants, in particular their strong reflection in the near infrared. The MODIS vegetation indices have recently also begun to be provided by NASA as a test-bed data set for big-data analysis on the Amazon AWS cloud platform, and their use is expected not only for global warming and desertification but also for familiar problems such as deforestation and the distribution of crops.

Logistic functions are often used to model the temporal change of vegetation indices, but noise and missing values in real data have made the large-scale automatic computation required for data mining difficult. To address this, statistical modeling based on the maximum-likelihood (ML) and MAP methods was studied in [2] and related work, its effectiveness was verified in experiments on the lower-resolution vegetation index (GIMMS NDVI, 1 km), and the processing was later accelerated by distributed processing ([5], 2014).

In this study, in order to apply the statistical modeling of vegetation-index changes to big-data analysis of the global environment, vegetation-index time series of the Shikoku region obtained from the MODIS sensor were modeled, building on the previous studies, with a noise-tolerant statistical method (the MAP method) as piecewise logistic functions, and the resulting model parameters were clustered with the x-means method to summarize and visualize the spatio-temporal change of the vegetation. The experiment yielded an average group plus three groups deviating from it, four clusters in total, but the result was strongly affected by the vegetation type, by failures of the modeling and by data containing no vegetation, so that a clear correspondence with the actual vegetation could not be established. Although the data selection and the modeling method still need improvement, the whole chain from data summarization for data mining to higher-level knowledge discovery could be carried out.
Contents

1. Introduction
2. Target data
   2.1 What is a vegetation index (VI)?
   2.2 MODIS
   2.3 Data used in this study
3. Details of the modeling method
   3.1 Modeling of the vegetation index
      3.1.1 Statistical modeling
      3.1.2 The logistic functions used
      3.1.3 Parameter estimation by the ML method
      3.1.4 Parameter estimation by the MAP method
      3.1.5 The sliding-window scheme
4. Experiments
   4.1 Preliminary experiments
      4.1.1 Experimental data and conditions
      4.1.2 Results
   4.2 Distributed-processing experiments with Hadoop
      4.2.1 Extraction of the time series
         4.2.1.1 Method
         4.2.1.2 Experimental conditions
         4.2.1.3 Results
      4.2.2 Clustering of the model parameters
         4.2.2.1 Method
         4.2.2.2 Results
5. Conclusions
Acknowledgements / References / Appendices
1. Introduction

The Earth we live on is an irreplaceable place not only for humans but also for many animals and plants. In recent years, however, human activity has had a large impact on the global environment, and environmental problems have become more severe year by year. Global warming, for example, raises temperatures and thereby causes a rise in sea level, more frequent abnormal weather, changes in climate and changes in the distribution of vegetation. There are also many serious problems affecting the global environment, such as desertification and forest loss caused by large-scale logging. Some plant species have been driven to extinction by the adverse effects of these environmental problems.

In recent years, Earth-observation activities using remote sensing have been carried out to address such problems. In satellite remote sensing, both the spatial and the temporal resolution have improved, and the observations now form big data about the global environment. The sensors on board measure quantities such as the strength of reflection and emission in each wavelength band. Furthermore, to monitor changes in climate and vegetation, data observed at regular intervals are required.

This study uses the vegetation index EVI, which represents the activity of vegetation and is derived from the data of MODIS (MODerate resolution Imaging Spectroradiometer), a visible-to-infrared radiometer developed by NASA, and aims to study the long-term evolution of the seasonal changes of vegetation from a data-mining point of view. Previous studies modeled the seasonal change of the vegetation index with piecewise logistic functions and proposed estimating the model parameters with the statistical learning methods ML (maximum likelihood) and MAP (Maximum a Posteriori) [Honda 06][1], [2] (2008), [3] (2009). These methods are robust against outliers and noise. In [2] (2008), the ML and MAP methods were applied to artificial data to confirm their effectiveness, and a method for speeding up the computation was proposed. In [3] (2009), aiming at validation on global data, the approach was verified with the lower-resolution GIMMS NDVI data set. Furthermore, [4] (2010) accelerated the computation by distributed processing with Xgrid (16 cores), and [5] (2014) by distributed processing with Hadoop MapReduce (50 nodes × 1 core). However, no verification had yet been carried out against vegetation for which a nearby field survey is possible, and higher-level knowledge discovery that makes use of the extracted model parameters had not been examined either.

In this study, therefore, the statistical modeling of vegetation-index changes is applied to the high-spatial-resolution MODIS EVI data of "Shikoku" and verified on a familiar, concrete example. In addition, we attempt to visualize the spatio-temporal change of the vegetation by clustering the obtained model parameters. Aiming at large-scale data analysis, the experiments are carried out with distributed processing on Hadoop.
2. Target data

2.1 What is a vegetation index (VI)?

When the land surface is observed from space, the reflectance of sunlight as a function of wavelength (the reflectance spectrum) and quantities computed from it are commonly used. A vegetation index (Vegetation Index) is an index that exploits the characteristic reflectance spectrum of green leaves to represent the distribution and activity of vegetation.

Figure 1 schematically shows the reflectance of materials found on the Earth's surface (water, soil, vegetation) as a function of wavelength. As the figure shows, plants absorb light in the red visible region (R), where their reflectance is low, and reflect strongly in the near-infrared region (NIR). This property makes it possible to extract vegetation from the reflectance spectrum. By studying the seasonal and interannual changes of a vegetation index, changes in the amount and activity of vegetation can be analyzed, which is expected to help in understanding the effect of climate change on vegetation, anthropogenic vegetation change, and the influence of such change on the environment.

Fig. 1. Reflectance of vegetation, soil and water in each wavelength band [6].

There are several types of vegetation indices; the two representative ones, NDVI and EVI, are introduced here.

The NDVI (Normalized Difference Vegetation Index) is computed as

  NDVI = (NIR − RED) / (NIR + RED),   (1)

where NIR is the reflectance in the near-infrared band and RED is the reflectance in the red visible band. NDVI takes values between −1 and 1; over actual land surfaces it is roughly between −0.1 and 0.7. NDVI is easily affected by clouds, the atmosphere and the soil background, and it has the drawback of saturating where vegetation is dense. The GIMMS data set used in the previous studies adopts NDVI.

To compensate for these drawbacks, the EVI (Enhanced Vegetation Index), which saturates less easily, was developed. EVI is computed as

  EVI = G · (NIR − RED) / (NIR + C1·RED − C2·BLUE + L),   (2)

where BLUE is the reflectance in the blue visible band, C1 and C2 are atmospheric-correction coefficients for the red and blue bands, L is a correction parameter related to the canopy background, and G is a gain factor for the EVI value. A sensor observing in the visible and near-infrared cannot see the land surface where it is blocked by clouds, so no vegetation index is obtained there; EVI additionally uses the blue band to remove noise such as aerosols. Because its sensitivity remains high over dense vegetation and atmospheric effects are corrected, EVI is considered to give more accurate values. The MODIS products used in this study provide both NDVI and EVI.
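As a concrete illustration of Eqs. (1) and (2), the following minimal sketch computes NDVI and EVI from per-band surface reflectances. The coefficient values G = 2.5, C1 = 6, C2 = 7.5 and L = 1 are the commonly published MODIS defaults and are an assumption here; the thesis text does not specify them.

// Minimal sketch of Eqs. (1) and (2); the EVI coefficients below are the
// commonly published MODIS defaults (an assumption, not taken from this thesis).
public class VegetationIndex {
    static final double G = 2.5, C1 = 6.0, C2 = 7.5, L = 1.0;

    // nir, red, blue: surface reflectances (0..1)
    static double ndvi(double nir, double red) {
        return (nir - red) / (nir + red);
    }

    static double evi(double nir, double red, double blue) {
        return G * (nir - red) / (nir + C1 * red - C2 * blue + L);
    }

    public static void main(String[] args) {
        // A dense-vegetation-like example: high NIR, low RED reflectance.
        System.out.println("NDVI = " + ndvi(0.45, 0.08));
        System.out.println("EVI  = " + evi(0.45, 0.08, 0.04));
    }
}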
2.2 MODIS

MODIS (Moderate Resolution Imaging Spectroradiometer) is one of the sensors carried on NASA's Earth-observation satellites Aqua and Terra and is a radiometer covering the visible to infrared range. The satellites were launched in 1999 (Terra) and 2002 (Aqua). MODIS observes in 36 spectral bands over the wavelength range 0.4–14.4 µm and covers the entire globe in one to two days. Thirty-four kinds of products concerning the atmosphere, the land surface and the oceans are published, with spatial resolutions of 1000 m, 500 m and 250 m. Among them, the vegetation-index data have begun to be provided as a test-bed data set for big-data analysis on Amazon AWS [7], and their use for problems such as deforestation and vegetation-change monitoring is expected. MODIS data are provided by NASA's LP DAAC (Land Processes Distributed Active Archive Center) [8], from whose site the necessary files can be downloaded.

Fig. 2. The Earth-observation satellites Aqua (left) and Terra (right) carrying MODIS [9].

Figure 3 shows an example of the MODIS EVI data used in this study. The horizontal axis is time (in years) and the vertical axis is the vegetation-index value. Since the observation interval is 16 days, 23 points make up one year, and the vegetation index rises and falls again every year around summer. The figure also shows a characteristic of satellite data: many missing values and much noise are included.

Fig. 3. Example of MODIS vegetation-index data.

Figure 4 shows MODIS EVI images for each season. Brownish areas indicate low EVI values and green areas high EVI values. As Figure 4 shows, the distribution of the vegetation index clearly changes with the seasons, and there is also spatial variation. From these changes it should be possible to infer the vegetation present at each location and, by observing over a long period, to investigate how the vegetation distribution at each location changes.

(a) 2009/3/6  (b) 2009/6/10  (c) 2009/9/14  (d) 2009/12/3
Fig. 4. Vegetation-index images taken by MODIS in each season [8].
2.3 Data used in this study

This study uses the EVI product of the MODIS sensor on the Earth-observation satellite Terra. The product has been published by NASA's LP DAAC [8] for the period from February 2000 onward. Its specifications are as follows.

Table 1. Specifications of the product
Item | Content
Image size | 4800×4800, 2400×2400, 1200×1200 (pixels)
Sampling interval | 1 image / 16 days
Resolution | 250 m/pixel, 500 m/pixel, 1 km/pixel
Latitude/longitude range | tiled every 10 degrees (e.g. longitude 130°–140°, latitude 30°–40° for the tile containing Shikoku) (see Fig. 5)
Period available | February 2000 – present
Map projection | sinusoidal projection (see Fig. 5)
Data format | HDF-EOS

In the sinusoidal projection (Fig. 5), the horizontal distance from the central meridian is proportional to cos φ at latitude φ.

Fig. 5. The sinusoidal projection [8].

A sensor on board a satellite provides many kinds of images and data. These could of course be provided one by one in separate formats, but storing and using many kinds of data separately is a cumbersome task, and converting data also changes its precision. For this reason NASA adopted HDF [10] to handle data of various formats in a unified way. HDF(4) is a multi-object file format, with an accompanying library, for storing and managing data across different machines, developed by the HDF Group.

Table 2 lists the MODIS products produced in HDF. The vegetation index at 250 m spatial resolution is contained in files such as "MOD13Q1.A20011001.h29v05.005.2008270101745.hdf", where MOD13Q1 is the product type, A20011001 is the acquisition date in Julian-day notation (A followed by the year and the day of year), h29v05 is the tile number, 005 is the version number, and 2008270101745 is the production timestamp. For the tile numbering, the globe is divided every 10 degrees of latitude and longitude, each tile is assigned a number, and a tile of the globe projected with the sinusoidal projection as in Fig. 5 is specified by h and v. In this study the tile h29v05, which contains Shikoku, is examined. This product contains layers such as NDVI and EVI.

Figure 6 shows an example of the EVI layer of h29v05 displayed as an image; HDFView [11] is used to display the HDF file.
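To make the naming convention concrete, the following small sketch splits a product file name of the form above into its fields. The helper class name is made up for illustration; only the layout described in the text is assumed.

// Sketch: split a MODIS product file name such as
//   MOD13Q1.A20011001.h29v05.005.2008270101745.hdf
// into the fields described above (product, acquisition date code, tile, version).
public class ModisFileName {
    public static void main(String[] args) {
        String name = "MOD13Q1.A20011001.h29v05.005.2008270101745.hdf";
        String[] f = name.split("\\.");
        System.out.println("product      : " + f[0]); // MOD13Q1
        System.out.println("acquisition  : " + f[1]); // A + year + day of year
        System.out.println("tile (h/v)   : " + f[2]); // h29v05
        System.out.println("version      : " + f[3]); // 005
        System.out.println("produced at  : " + f[4]); // processing timestamp
    }
}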
Table 2. List of MODIS products (1/3) [8]
Short Name | Platform | MODIS Data Product | Raster type | Res (m) | Temporal granularity
MCD12C1 Combined Land Cover Type CMG 5600m Yearly
MCD12Q1 Combined Land Cover Type Tile 500m Yearly
MCD12Q2 Combined Land Cover Dynamics Tile 500m Yearly
MCD15A2 Combined Leaf Area Index - FPAR Tile 1000m 8 day
MCD15A3 Combined Leaf Area Index - FPAR Tile 1000m 4 day
MCD43A1 Combined BRDF-Albedo Model Parameters Tile 500m 16 day
MCD43A2 Combined BRDF-Albedo Quality Tile 500m 16 day
MCD43A3 Combined Albedo Tile 500m 16 day
MCD43A4 Combined Nadir BRDF-Adjusted Reflectance Tile 500m 16 day
MCD43B1 Combined BRDF-Albedo Model Parameters Tile 1000m 16 day
MCD43B2 Combined BRDF-Albedo Quality Tile 1000m 16 day
MCD43B3 Combined Albedo Tile 1000m 16 day
MCD43B4 Combined Nadir BRDF-Adjusted Reflectance Tile 1000m 16 day
MCD43C1 Combined BRDF-Albedo Model Parameters CMG 5600m 16 day
MCD43C2 Combined BRDF-Albedo Snow-free Quality CMG 5600m 16 day
MCD43C3 Combined Albedo CMG 5600m 16 day
MCD43C4 Combined Nadir BRDF-Adjusted Reflectance CMG 5600m 16 day
MCD45A1 Combined Thermal Anomalies & Fire Tile 500m Monthly
MOD09A1 Terra Surface Reflectance Bands 1–7 Tile 500m 8 day
MOD09CMG Terra Surface Reflectance Bands 1–7 CMG 5600m Daily
MOD09GA Terra Surface Reflectance Bands 1–7 Tile 500/1000m Daily
MOD09GQ Terra Surface Reflectance Bands 1–2 Tile 250m Daily
MOD09Q1 Terra Surface Reflectance Bands 1–2 Tile 250m 8 day
MOD11A1 Terra Land Surface Temperature & Emissivity Tile 1000m Daily
MOD11A2 Terra Land Surface Temperature & Emissivity Tile 1000m 8 day
MOD11B1 Terra Land Surface Temperature & Emissivity Tile 5600m Daily
MOD11C1 Terra Land Surface Temperature & Emissivity CMG 5600m Daily
MOD11C2 Terra Land Surface Temperature & Emissivity CMG 5600m 8 day
MOD11C3 Terra Land Surface Temperature & Emissivity CMG 5600m Monthly
MOD11_L2 Terra Land Surface Temperature & Emissivity Swath 1000m 5 min
MOD13A1 Terra Vegetation Indices Tile 500m 16 day
MOD13A2 Terra Vegetation Indices Tile 1000m 16 day
Table 2. List of MODIS products (2/3) [8]
Short Name | Platform | MODIS Data Product | Raster type | Res (m) | Temporal granularity
MOD13A3 Terra Vegetation Indices Tile 1000m Monthly
MOD13C1 Terra Vegetation Indices CMG 5600m 16 day
MOD13C2 Terra Vegetation Indices CMG 5600m Monthly
MOD13Q1 Terra Vegetation Indices Tile 250m 16 day
MOD14 Terra Thermal Anomalies & Fire Swath 1000m 5 min
MOD14A1 Terra Thermal Anomalies & Fire Tile 1000m Daily
MOD14A2 Terra Thermal Anomalies & Fire Tile 1000m 8 day
MOD15A2 Terra Leaf Area Index - FPAR Tile 1000m 8 day
MOD17A2 Terra Gross Primary Productivity Tile 1000m 8 day
MOD17A3 Terra Net Primary Productivity Tile 1000m Yearly
MOD44A Terra Vegetation Continuous Cover Tile 250m 96 day
MOD44B Terra Vegetation Continuous Fields Tile 250m Yearly
MOD44W Terra Land Water Mask Derived Tile 250m None
MYD09A1 Aqua Surface Reflectance Bands 1–7 Tile 500m 8 day
MYD09CMG Aqua Surface Reflectance Bands 1–7 CMG 5600m Daily
MYD09GA Aqua Surface Reflectance Bands 1–7 Tile 500/1000m Daily
MYD09GQ Aqua Surface Reflectance Bands 1–2 Tile 250m Daily
MYD09Q1 Aqua Surface Reflectance Bands 1–2 Tile 250m 8 day
MYD11A1 Aqua Land Surface Temperature & Emissivity Tile 1000m Daily
MYD11A2 Aqua Land Surface Temperature & Emissivity Tile 1000m 8 day
MYD11B1 Aqua Land Surface Temperature & Emissivity Tile 5600m Daily
MYD11C1 Aqua Land Surface Temperature & Emissivity CMG 5600m Daily
MYD11C2 Aqua Land Surface Temperature & Emissivity CMG 5600m 8 day
MYD11C3 Aqua Land Surface Temperature & Emissivity CMG 5600m Monthly
MYD11_L2 Aqua Land Surface Temperature & Emissivity Swath 1000m 5 min
MYD13A1 Aqua Vegetation Indices Tile 500m 16 day
MYD13A2 Aqua Vegetation Indices Tile 1000m 16 day
MYD13A3 Aqua Vegetation Indices Tile 1000m Monthly
MYD13C1 Aqua Vegetation Indices CMG 5600m 16 day
MYD13C2 Aqua Vegetation Indices CMG 5600m Monthly
MYD13Q1 Aqua Vegetation Indices Tile 250m 16 day
MYD14 Aqua Thermal Anomalies & Fire Swath 1000m 5 min
Table 2. List of MODIS products (3/3) [8]
Short Name | Platform | MODIS Data Product | Raster type | Res (m) | Temporal granularity
MYD14A1 Aqua Thermal Anomalies & Fire Tile 1000m Daily
MYD14A2 Aqua Thermal Anomalies & Fire Tile 1000m 8 day
MYD15A2 Aqua Leaf Area Index - FPAR Tile 1000m 8 day
MYD17A2 Aqua Gross Primary Productivity Tile 1000m 8 day
Fig. 6. Example display of the EVI layer in HDFView.
3. Details of the modeling method

3.1 Modeling of the vegetation index

3.1.1 Statistical modeling

In this study the statistical modeling methods for vegetation-index change, the ML (maximum-likelihood) method and the MAP method, are applied to modeling the high-spatial-resolution MODIS EVI data of Shikoku. This section summarizes the methods, following the previous studies.
3.1.2 The piecewise logistic functions used

Figure 7 shows the typical change of the vegetation index over one year. Since the seasonal change consists of a rise from winter to summer and a fall from summer to winter, it is fitted with two piecewise functions. (This assumes that vegetation with such a seasonal cycle is actually present; evergreen stands, multiple-cropped fields and the like are not covered by this model.)

Fig. 7. Modeling of the vegetation index (schematic: one year of EVI data with the fitted rising and falling segments).

The two segments are expressed by the following logistic functions:

  F(t_k | θ) = f_{i,1}(t_k | θ_{i,1})   for tb_{i−1} ≤ t_k < tt_i,
               f_{i,2}(t_k | θ_{i,2})   for tt_i ≤ t_k < tb_i,                      (3)

  f_{i,1}(t_k | θ_{i,1}) = c_{i,1} / (1 + exp(a_{i,1} + b_{i,1} t_k)) + d_{i,1},    (4)
  f_{i,2}(t_k | θ_{i,2}) = c_{i,2} / (1 + exp(a_{i,2} + b_{i,2} t_k)) + d_{i,2}.    (5)

Here i is the year index, and j = 1 denotes the winter-to-summer rise while j = 2 denotes the summer-to-winter fall. tb_i is the intersection of f_{i−1,2} and f_{i,1}, and tt_i is the intersection of f_{i,1} and f_{i,2}. The model parameters of the segments are

  θ_{i,j} = { a_{i,j}, b_{i,j}, c_{i,j}, d_{i,j} | i = 1, 2, …, n;  j = 1, 2 }.

The four model parameters of each segment have a concrete meaning with respect to the seasonal change of that segment; the meaning is shown in Fig. 8 and Table 3.

Fig. 8. Illustration of the parameters (one year of EVI with the levels c1+d1, d1, c2+d2, d2 and the timing/steepness parameters a1, b1, a2, b2).

Table 3. Meaning of the parameters
Parameter | Meaning
a1, a2 | shift of the timing of the rise / fall
b1, b2 | steepness of the rise / fall
c1+d1, d1 | minimum and maximum value of the winter-to-summer segment
c2+d2, d2 | maximum and minimum value of the summer-to-winter segment

By making the maximum (or minimum) values of adjacent segments coincide, the model can be connected smoothly across the data:

  d_{i,2} = d_{i,1} + c_{i,1},            (6)
  d_{i,2} = c_{i+1,1} + d_{i+1,1}.        (7)

To apply the model to multi-year data, (6) and (7) are rearranged into recurrences for d_{i,j}:

  d_{i+1,1} = d_{i,2} − c_{i+1,1},                  (8)
  d_{i+1,2} = d_{i,2} − c_{i+1,1} − c_{i+1,2}.      (9)

Substituting these recurrences into Eqs. (4) and (5) extends the model to multi-year data; Eqs. (10)–(13) give the resulting explicit forms of f_{i,1} and f_{i,2}, in which the offset of each segment is rewritten in terms of d_{1,2} and the amplitude parameters c_{m,j} of the preceding segments. (In the experiments of Chapter 4, however, these continuity constraints are not imposed, in order to increase the degrees of freedom of the solution and improve the fitting accuracy.)
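The sketch below, a minimal illustration of Eqs. (3)–(5), evaluates the two logistic segments of a single year and switches between them at the intersection time tt. The parameter values are arbitrary illustrations chosen so that the curve rises from c1+d1 toward d1 and falls back toward d2, as in Table 3; the multi-year recurrences of Eqs. (6)–(13) are not reproduced.

// Minimal sketch of the one-year piecewise logistic model of Eqs. (3)-(5).
// Parameter names follow Table 3; the values are illustrative only.
public class PiecewiseLogistic {
    // f_{i,j}(t) = c / (1 + exp(a + b*t)) + d   (Eqs. (4)-(5))
    static double segment(double t, double a, double b, double c, double d) {
        return c / (1.0 + Math.exp(a + b * t)) + d;
    }

    // F(t) for one year: rising segment before the intersection tt, falling segment after (Eq. (3)).
    static double model(double t, double tt, double[] rise, double[] fall) {
        double[] p = (t < tt) ? rise : fall;   // p = {a, b, c, d}
        return segment(t, p[0], p[1], p[2], p[3]);
    }

    public static void main(String[] args) {
        // Winter level c1+d1 = 0.2, summer level d1 = 0.6 (so c1 < 0);
        // the falling segment returns from c2+d2 = 0.6 toward d2 = 0.2.
        double[] rise = { -6.0, 1.5, -0.4, 0.6 };
        double[] fall = { -14.0, 1.5, 0.4, 0.2 };
        for (double t = 0; t <= 12; t += 1.0)          // t in months over one year, tt = 6
            System.out.printf("t=%4.1f  F(t)=%.3f%n", t, model(t, 6.0, rise, fall));
    }
}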
3.1.3 Parameter estimation by the ML method

The ML method (maximum-likelihood estimation) determines the model parameters θ for which the likelihood becomes maximal. Given a set of model parameters θ, the probability of observing the data set X = {x_1, x_2, …, x_n} is

  P(X | θ) = Π_{i=1}^{n} P(x_i | θ),                 (14)

where P(x_i | θ) is the probability of observing one data point and n is the number of data points. Assuming that the observations scatter around the model with a normal distribution of variance σ²,

  P(x_i | θ) = N( F(t_i | θ), σ² ).                  (15)

The likelihood l(θ), the index of how plausible the model is, is therefore

  l(θ) = P(X | θ) = Π_{i=1}^{n} P(x_i | θ).          (16)

Substituting the piecewise model of Section 3.1.2 gives

  l(θ) = Π_{i=1}^{M} Π_{j=1}^{2} Π_{k=1}^{n_{i,j}} N( f_{i,j}(t_k | θ_{i,j}), σ² ),   (17)

where M is the number of years to be estimated. Taking the logarithm of the likelihood turns the products into sums, so in practice the log-likelihood

  L(θ) = log l(θ)                                    (18)

is maximized, and the optimal model parameters are obtained as

  θ̂ = argmax_θ L(θ).                                (19)

Following [2], the maximization in (19) is carried out with Newton's method, which refines the solution iteratively. The update rule for the model parameters is

  θ̂_n = θ̂_0 + Δ · L′(θ̂_0) / L″(θ̂_0),              (20)

where θ̂_n is the parameter vector after the update, θ̂_0 the parameter vector before the update, and Δ a constant adjusting the size of the correction; the first derivative L′(θ) gives the direction of the correction and the second derivative L″(θ) its magnitude. Because the correction approaches zero as the solution approaches the maximum (or a minimum), convergence can be judged when the relative change of the log-likelihood falls below a threshold ε:

  dl = | (L_n − L_0) / L_0 | < ε,                    (21)
  ε = 10^{−5},                                       (22)

where L_n and L_0 are the log-likelihood after and before the update. The maximum number of iterations q is also fixed beforehand in proportion to the number of years M (Eq. (23)).

The parameter-estimation algorithm of the ML method is as follows.
1. Set initial values of the model parameters θ.
2. Generate the initial model from them and compute the log-likelihood L(θ).
3. Repeat the following as long as dl ≥ ε and the iteration count k < q:
4.  Compute the intersections tb, tt of the two segment functions.
5.  Compute the first and second derivatives.
6.  Update θ with Eq. (20).
7.  Compute the log-likelihood again with the new θ.
8.  Update dl with Eq. (21), set k = k + 1 and return to 3.
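As a toy illustration of the update rule (20) and the convergence test (21)–(22), the sketch below maximizes a one-dimensional stand-in objective with Newton's method and numerical derivatives. The objective is made up for illustration and is not the actual EVI log-likelihood; the standard sign convention of the Newton step is used.

// Toy Newton iteration illustrating Eqs. (20)-(22).
public class NewtonSketch {
    // Stand-in objective with a single maximum at theta = 3 (made up for illustration).
    static double L(double th) { return 10.0 - (th - 3.0) * (th - 3.0) - 0.1 * Math.pow(th - 3.0, 4); }

    public static void main(String[] args) {
        double theta = 0.0, delta = 1.0, eps = 1e-5, h = 1e-4;
        double prev = L(theta);
        for (int k = 0; k < 100; k++) {
            double d1 = (L(theta + h) - L(theta - h)) / (2 * h);                 // L'(theta)
            double d2 = (L(theta + h) - 2 * L(theta) + L(theta - h)) / (h * h);  // L''(theta)
            theta -= delta * d1 / d2;            // Newton step toward the stationary point (cf. Eq. (20))
            double cur = L(theta);
            double dl = Math.abs(cur - prev) / Math.max(Math.abs(prev), 1e-12);  // cf. Eq. (21)
            System.out.printf("k=%d  theta=%.6f  L=%.6f  dl=%.2e%n", k, theta, cur, dl);
            if (dl < eps) break;                 // Eq. (22): eps = 1e-5
            prev = cur;
        }
    }
}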
3.1.4 Parameter estimation by the MAP method

The MAP method (maximum a posteriori) determines the model parameters for which the posterior probability becomes maximal. The posterior probability is the conditional probability that the model has the parameter set θ given the data X. Using Bayes' theorem,

  P(θ | X) = P(X | θ) P(θ) / P(D),                   (24)

where P(θ) is the probability density of the parameter set θ. For simplicity, P(θ) is expressed here by normal distributions,

  P(θ) = Π_{i=1}^{M} Π_{j=1}^{2} N( μ_{θ_{i,j}}, τ_{θ_{i,j}} ),   (25)

where μ_{θ_{i,j}} is the mean and τ_{θ_{i,j}} the variance of the parameters θ_{i,j}. The denominator P(D) of Eq. (24) is a constant and does not affect where the maximum of (24) lies, so Eq. (24) can be reduced to

  s(θ) = P(X | θ) P(θ),                              (26)

and the quantity to be determined becomes

  θ̂ = argmax_θ s(θ).                                (27)

As in the ML method, the logarithm is taken to simplify the computation,

  S(θ) = log s(θ) = log{ P(X | θ) P(θ) },            (28)

and, rearranging (28), the solution is obtained as

  θ̂ = argmax_θ S(θ).                                (29)

To find the optimum, Newton's method is used in the MAP method just as in the ML method. The update rule is

  θ̂_n = θ̂_0 + Δ · S′(θ̂_0) / S″(θ̂_0),              (30)

where, as in the ML method, θ̂_n is the parameter vector after the update, θ̂_0 the vector before the update and Δ the constant adjusting the correction; the first derivative S′(θ) gives the direction and the second derivative S″(θ) the magnitude of the correction. Convergence is judged with

  ds = | (S_n − S_0) / S_0 | < ε.                    (31)

The parameter-estimation algorithm of the MAP method is summarized as follows.
1. Set initial values of the model parameters θ.
2. Generate the initial model from them and compute S(θ).
3. Repeat the following as long as ds ≥ ε and the iteration count k < q:
4.  Compute the intersections tb, tt of the two segment functions.
5.  Compute the first and second derivatives.
6.  Update θ with Eq. (30).
7.  Compute S(θ) again with the new θ.
8.  Update ds with Eq. (31), set k = k + 1 and return to 3.
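The practical difference from the ML objective is the additive log-prior term, S(θ) = L(θ) + log P(θ), with the Gaussian prior of Eq. (25). The sketch below shows that term; the prior means and variances in the example are illustrative assumptions, since the actual values used are not listed in this section.

// Sketch of the MAP objective S(theta) = logLikelihood + logPrior (Eqs. (26)-(28)),
// with an independent Gaussian prior N(mu, tau) per parameter as in Eq. (25).
public class MapObjective {
    // log of a univariate normal density N(mu, tau) evaluated at x (tau = variance)
    static double logNormal(double x, double mu, double tau) {
        return -0.5 * Math.log(2 * Math.PI * tau) - (x - mu) * (x - mu) / (2 * tau);
    }

    // theta, mu, tau: one entry per model parameter (a, b, c, d of each segment)
    static double logPosterior(double logLikelihood, double[] theta, double[] mu, double[] tau) {
        double s = logLikelihood;
        for (int i = 0; i < theta.length; i++)
            s += logNormal(theta[i], mu[i], tau[i]); // the prior pulls each parameter toward mu[i]
        return s;
    }

    public static void main(String[] args) {
        double[] theta = { -6.0, 1.5, -0.4, 0.6 };
        double[] mu    = { -6.0, 1.0, -0.5, 0.5 };   // assumed prior means (illustrative)
        double[] tau   = {  4.0, 1.0,  0.1, 0.1 };   // assumed prior variances (illustrative)
        System.out.println("S(theta) = " + logPosterior(-12.3, theta, mu, tau));
    }
}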
3.1.5 The sliding-window scheme

[2] (2008) pointed out that, when the modeling is applied to a vegetation-index series of ten years or more, the number of parameters to be estimated grows with the number of years and the number of iterations required for convergence also grows, so that the amount of computation becomes O(n²). To reduce the computation time, [2] (2008) therefore proposed a sliding-window scheme: the range of years estimated at one time is fixed, and the model parameters are estimated window by window while the window is slid along the series. Because the year boundaries at both ends of a window often contain many values missing due to clouds or snow, which makes fitting difficult, adjacent windows are given an overlap of two years. After the parameters have been estimated, the parameters of the overlapped segments are discarded and the windows are joined, so that the continuity of the solution is also preserved. The experiments confirmed that with these measures the amount of computation can be reduced to O(n) while the accuracy and the continuity of the solution are maintained. This study also adopts the scheme of [2]; the maximum number of years estimated in one computation is set to 5 and the minimum overlap to 2 years.

Fig. 9. Schematic of the estimation ranges in the sliding-window scheme (years 1–13).
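The sketch below lays out the windows described above over a 13-year series with 23 composites per year: 5-year windows advanced by 3 years, giving a 2-year overlap between neighbours. Which window's estimate is kept for the overlapped years follows the rule in the text and is only hinted at in the comments.

// Sketch of the sliding-window layout: 5-year windows, 2-year overlap,
// over a 13-year series with 23 composites per year (16-day sampling).
public class SlidingWindows {
    public static void main(String[] args) {
        int years = 13, perYear = 23;
        int window = 5, overlap = 2, step = window - overlap; // advance 3 years per window
        for (int startYear = 0; startYear < years; startYear += step) {
            int endYear = Math.min(startYear + window, years);   // the last window may be shorter
            int from = startYear * perYear, to = endYear * perYear;
            System.out.printf("window: years %d-%d  samples [%d,%d)%n",
                              startYear + 1, endYear, from, to);
            // The parameters of the overlapped leading years would be discarded and the
            // windows joined on the retained years, keeping the solution continuous.
            if (endYear == years) break;
        }
    }
}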
4. Experiments

4.1 Preliminary experiments

In this section, preliminary experiments are first carried out with a program implemented in C. For the parameter estimation, the two methods described in Chapter 3 are used:
- the ML method
- the MAP method
For each of them, three variants concerning the year boundaries of the segments are examined:
- fixed yearly boundaries, all parameters determined simultaneously (fix)
- free yearly boundaries, all parameters determined simultaneously (free)
- the sliding-window scheme of Section 3.1.5 (slide)
In total, six combinations are therefore examined. The methods used in the experiments are summarized in Table 3 below. In the following, for brevity, the parameter-estimation algorithm and the segment-boundary scheme are combined into a short label; for example, the MAP method with free boundaries is written MAP-free.

Table 3. Methods used in the preliminary experiments
Case | Year-segment boundaries | Parameter-estimation algorithm
1 | fixed (fix) | MAP
2 | free (free) | MAP
3 | sliding (slide) | MAP
4 | fixed (fix) | ML
5 | free (free) | ML
6 | sliding (slide) | ML

In each experiment the following items are evaluated:
- the influence of the number of trials on convergence,
- the mean error of the best solution,
- the computation time of the best solution for each method.
The results of these preliminary experiments are then used to determine the conditions for the Hadoop experiments in the next section.
4.1.1 Experimental data and conditions

The experiments use MODIS EVI data at the highest spatial resolution, 250 m/pixel (4800×4800 pixels). Table 4 gives the details of the data; three points sampled from this data set were used in the experiments.

Table 4. Details of the target data
Item | Value
Image size | 4800×4800 (pixels)
Data size per point | 16 bit
File format | HDF-EOS
Acquisition interval | 1 image / 16 days
Resolution | 250 m
Latitude/longitude range | longitude 130°–140°, latitude 30°–40°
Years acquired | 2001–2013 (13 years)

Table 5 lists the positions of the three sampled points together with their maximum and minimum EVI values. The experiments use data of the tile containing Shikoku (h29v05) at 250 m resolution, sampled at 16-day intervals over the 13 years 2001–2013.

Since the data format is HDF, a program was written that extracts the EVI layer as binary data using the HDF4 tools provided by the HDF Group, together with a shell program that applies it to a large number of images in one run (Appendix A.1).

The experiments fit the Shikoku data (MODIS EVI) of Tables 4–5 while varying the initial parameter values, running 10 and 20 trials per case; the solution with the largest likelihood or posterior probability is taken as the best solution. In the sliding-window scheme, the windows cover 5 years each with a 2-year overlap, again with 10 and 20 trials per window, and the best solution in each window is adopted for that estimation range.

The error used for evaluation is computed as

  E = Σ_{i=1}^{n} ( y_i − f(t_i) )²,

where E is the error, y_i are the data values and f is the model function.
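A direct transcription of the error measure E as code; dividing by the number of points gives the mean error reported in the next subsection.

// Error measure used in the preliminary experiments: sum of squared residuals
// between the observed EVI values y[i] and the fitted model values f[i].
public class FitError {
    static double squaredError(double[] y, double[] f) {
        double e = 0.0;
        for (int i = 0; i < y.length; i++) {
            double r = y[i] - f[i];
            e += r * r;          // divide by y.length to report a mean error
        }
        return e;
    }

    public static void main(String[] args) {
        double[] y = { 0.21, 0.35, 0.58, 0.44 };
        double[] f = { 0.20, 0.33, 0.60, 0.41 };
        System.out.println("E = " + squaredError(y, f));
    }
}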
4.1.2 Results

Table 5. Sampled points and their EVI range
ID | Coordinates | EVI max | EVI min | max − min
F | (88, 2876) | 7538 | 1634 | 5904
G | (616, 2360) | 5860 | 255 | 5605
H | (1320, 2840) | 9895 | 1865 | 8030

Figure 9 summarizes the mean error between the best model and the data obtained with different numbers of trials (10 and 20). MAP-fix is not included because for MAP-fix no case yielded a stable solution. For the remaining methods there is almost no difference in accuracy between 10 and 20 trials. The smallest mean errors were obtained with ML-slide and MAP-slide.

Fig. 9. Mean error between the best model and the data for each method (solid line: 10 trials, dotted line: 20 trials).

Figure 10 shows the computation time of the trial that produced the best solution for each method (proportional to the number of iterations required for convergence). Although Fig. 9 showed almost no difference in accuracy between 10 and 20 trials for any case, the time actually required for convergence varies considerably with the data and the number of trials. The scatter is largest for MAP-free. In contrast, the sliding-window variants MAP-slide and ML-slide have comparatively short computation times with little scatter, i.e. they are stable. This indicates that reducing the number of parameters determined in one computation makes the solution converge stably. Considering this stability, the sliding-window scheme is preferable for both MAP and ML, also from the viewpoint of avoiding the scatter and instability of the iteration counts.

Fig. 10. Computation time of the best solution for each method (solid line: 10 trials, dotted line: 20 trials).

Whether the MAP method is superior to the ML method could not be decided from this small amount of data. However, since the MAP method expresses the problem to be solved more faithfully in strict statistical terms, and since its computation time is not much larger than that of the ML method, MAP-slide is adopted as the standard method for the computations from the next section onward.
4.2 Distributed-processing experiments with Hadoop

In this section, referring to the preliminary experiments of Section 4.1 and using the implementation of time-series extraction from image time series and of the modeling on Hadoop/MapReduce by [5] (2014), 400 points are sampled from the MODIS EVI data of Shikoku and modeled, and their spatial distribution and temporal change are examined.

4.2.1 Extraction of the time series

4.2.1.1 Method

For the extraction of the vegetation-index time series, the program of [5] (2014) was used. It is implemented so that large amounts of time-series data can be processed efficiently in a distributed way with MapReduce on Hadoop.

In the method of [5] (2014), the extraction step first reads the time-series images; the Map stage emits, for each input image, the block ID of the image block as the key and, as the value, the coordinates, the time and the blocked image (vegetation index). In the Shuffle stage the data are grouped by block ID and handed to the nodes that perform the Reduce stage. The Reduce stage assembles the input data into per-pixel time series and outputs the coordinates as the key and the vegetation-index time series as the value. Afterwards, a separate program performs the modeling, again with MapReduce, applying the MAP-slide method to each time series and emitting the model parameters as the value. The parameters are estimated with the sliding-window scheme, with a window of 5 years and a minimum overlap of 2 years. The details of the programs are given in Appendices C and D.
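The following skeleton is a simplified sketch of the key/value flow just described, not the actual implementation (which, including the binary block encoding and the custom writable types, is listed in Appendix C). The input key/value types and the string encoding of the values are illustrative assumptions.

// Simplified sketch of the key/value flow described above (structure only).
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Map: one input image -> (block ID, "time:block pixel values") per block.
class BlockMapper extends Mapper<IntWritable, Text, IntWritable, Text> {
    @Override
    protected void map(IntWritable imageTime, Text blockPixels, Context ctx)
            throws IOException, InterruptedException {
        int blockId = 0; // in the real code, derived from the pixel offset and the reducer count
        ctx.write(new IntWritable(blockId),
                  new Text(imageTime.get() + ":" + blockPixels.toString()));
    }
}

// Reduce: all dates of one block -> per-pixel time series keyed by coordinates.
class SeriesReducer extends Reducer<IntWritable, Text, Text, Text> {
    @Override
    protected void reduce(IntWritable blockId, Iterable<Text> datedBlocks, Context ctx)
            throws IOException, InterruptedException {
        // Sort the blocks by date, then emit key = "x,y", value = "v(t1),v(t2),..."
        // for every pixel of the block (details omitted; see groupkeyreducer.java).
    }
}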
4.2.1.2 Experimental conditions

The Hadoop experiments were run only on the region containing Shikoku (1000×1000 pixels, the rectangle from coordinates (30, 2600) to (1029, 3599)) cut out of the tile of the relevant MODIS product MOD13 (4800×4800 pixels) (Appendix A.2). Figure 11 shows an example of the extracted region containing Shikoku.

Fig. 11. Tile used in the experiment and an example of the extracted image (1000×1000 pixels). Because of the sinusoidal projection the image is distorted in the east–west direction. The extracted region corresponds to (30, 2600)–(1029, 3599) in the original tile (4800×4800 pixels).

Table 6 summarizes the data used in the experiment. In the modeling experiment, after the time series of all points in this data set had been extracted, 400 points in total were sampled at intervals of 50×50 pixels and the MAP-slide modeling was applied to them. As a counter-measure against clouds and noise, at each date the median of the set consisting of a data point and its 8 neighbours was taken.

Table 6. Data used in the experiment
Item | Value
Source data | 13 years of MOD13Q1 (299 images in total), 4800×4800 pixels each
Region used | EVI layer of the 1000×1000-pixel region from (30, 2600) to (1029, 3599)
Sampling interval | spatial: 50×50 pixels

The distributed processing with Hadoop used part of the educational experiment system of the information-science course, Faculty of Science, Kochi University. Table 7 shows the experimental environment.

Table 7. Environment of the Hadoop distributed-processing experiment
Item | Value
Machine type (master and slaves) | iMac
Processor | Intel Core
Memory | —
Number of slave nodes | 20 (only 1 of the 4 cores used per node)
OS | Mac OS X
Hadoop version | —
Java version | 1.7.0_51
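The 8-neighbour median used as the cloud/noise counter-measure can be written as the small sketch below; array bounds are assumed valid, and the real processing works on the extracted 1000×1000 block rather than the toy array shown.

import java.util.Arrays;

// Median of a pixel and its 8 neighbours in one 16-day composite,
// used as a simple cloud/noise counter-measure before modeling.
public class NeighborhoodMedian {
    static int median9(short[][] img, int x, int y) {   // assumes 1 <= x,y <= size-2
        int[] v = new int[9];
        int n = 0;
        for (int dy = -1; dy <= 1; dy++)
            for (int dx = -1; dx <= 1; dx++)
                v[n++] = img[y + dy][x + dx];
        Arrays.sort(v);
        return v[4];                                     // middle of the 9 values
    }

    public static void main(String[] args) {
        short[][] img = { {100, 120, 110}, {130, 9000, 125}, {115, 105, 118} };
        System.out.println("median = " + median9(img, 1, 1)); // the 9000 outlier is ignored
    }
}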
4.2.1.3 Results

Table 8 shows the time required for the distributed processing with Hadoop. Using 20 Hadoop slave nodes (only 1 of the 4 cores of each), the time-series extraction for the one million points of the region took 1 minute 13 seconds, which is very fast, whereas the modeling took about 19 minutes for 340 points. From the results of [5] (2014), both processes can be expected to speed up roughly in proportion to the number of nodes up to about 30 nodes; for the modeling, however, computing all points would take about 2500 times as long (1,000,000 / 400 = 2500), i.e. on the order of 800 hours. This is attributed to the large number of iterative computations contained in the modeling step and, at present, to the many file writes performed so that the results can be evaluated. For practical use it will be necessary to speed the processing up by reducing the writing, and also to exploit parallel processing such as GPGPU. Recently, APIs for using GPUs from Java, such as Aparapi [12], have appeared and can be used on Hadoop, so using them should also be examined in the future.

Table 8. Time required for each process
Process | Time required | Remarks
Time-series extraction | 1 min 13 s | 299 images of the 1000×1000-pixel region; per-point series length 299
Modeling | approx. 19 min | 400 points sampled from the above; points judged to be water were not actually computed, so effectively 340 points

Next, the figure below shows an example of a modeling result; the blue marks are the data and the green line is the model. The result indicates that modeling reflecting the seasonal change of the data in each year is on the whole obtained adequately. There was not a single case in which the modeling broke down, so a stable modeling procedure has been established.

Fig. 12. Example of a modeling result.
4.2.2 Clustering of the model parameters

4.2.2.1 Method

By clustering the parameters extracted by the processing of Section 4.2.1, the vegetation is grouped coarsely and its spatial distribution is visualized. For the clustering, the x-means method, a kind of hierarchical clustering, is used [13]. x-means is an extension of the k-means method. k-means starts from K randomly chosen centroids and repeats the assignment of each data point to the nearest cluster and the re-computation of the centroids until the centroids no longer change. x-means, on the other hand, first divides the data into a fairly small number of clusters with k-means and then repeatedly splits each cluster into two, again with k-means, until the split is judged to be inappropriate. Unlike k-means, x-means has the advantage that an appropriate number of clusters is determined automatically.

In the experiment, for the most recent four years of data, the parameter list of the seasonal change of each year at each extracted coordinate, {a1, b1, c1, d1, a2, b2, c2, d2}, is regarded as an 8-dimensional vector, and clustering based on the Euclidean distance in this 8-dimensional attribute space is performed (see Fig. 8 for the meaning of the parameters). Because the ranges of the values differ greatly between attributes (parameters), the input data were normalized beforehand, attribute by attribute, with the mean and the standard deviation.

For the preprocessing, the clustering and the visualization, weka 3-6-10 [14], developed at the University of Waikato (New Zealand), was used. In x-means the number of clusters is determined automatically through splitting tests; the allowed range was set to 2–10.
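The normalization step described above standardizes each of the eight attributes to zero mean and unit standard deviation before the Euclidean-distance clustering; a minimal sketch follows. The clustering itself was done with Weka's X-Means and is not reproduced here.

// Standardize each attribute (column) of the 8-dimensional parameter vectors
// to zero mean and unit standard deviation before Euclidean-distance clustering.
public class Standardize {
    static void zscore(double[][] rows) {               // rows[i] = {a1,b1,c1,d1,a2,b2,c2,d2}
        int dim = rows[0].length;
        for (int j = 0; j < dim; j++) {
            double mean = 0, var = 0;
            for (double[] r : rows) mean += r[j];
            mean /= rows.length;
            for (double[] r : rows) var += (r[j] - mean) * (r[j] - mean);
            double sd = Math.sqrt(var / rows.length);
            for (double[] r : rows) r[j] = (sd > 0) ? (r[j] - mean) / sd : 0.0;
        }
    }

    public static void main(String[] args) {
        double[][] p = { { -6, 1.5, -0.4, 0.6, -14, 1.5, 0.4, 0.2 },
                         { -5, 1.2, -0.3, 0.5, -13, 1.4, 0.3, 0.2 } };
        zscore(p);
        System.out.println(java.util.Arrays.deepToString(p));
    }
}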
4.2.2.2 Results

Figure 13 shows the screen of the clustering run with Weka. The optimal number of clusters was judged to be 4. Table 9 shows the number and fraction of instances contained in each cluster: cluster 3 accounts for 75% and cluster 0 for 15%, while the remaining clusters 1 and 2 hold only 4% and 7%, respectively.

Fig. 13. Output of the clustering (the centroid values are displayed truncated after the third decimal place).

Table 9. Fraction of instances in each cluster
Cluster | Fraction (%)
0 | 15
1 | 4
2 | 7
3 | 75

Figure 14 shows the centroid vector of each cluster. For c1 and c2, to make the interpretation easier, the plotted values are not the ones used for the clustering but the asymptotic values c1+d1 and c2+d2 of the respective logistic functions. The result shows that cluster 3, which holds the large majority, has values near 0 for every attribute, i.e. it is the group with the average seasonal change. Cluster 0 is a group with a large d1, i.e. a large summer maximum (see Fig. 8 and Table 3 in Section 3.1.2), whereas clusters 1 and 2 have a d1 smaller than the average. Since c1+d1 and d2, and d1 and c2+d2, should normally take almost equal values, the fact that c1+d1 and d2 differ greatly for cluster 2 shows that these are data with abnormal characteristics as a seasonal vegetation change. It was confirmed that the optimal number of clusters and the broad trend of the centroid vectors do not change when the computation is repeated with different initial values.

Fig. 14. Normalized centroid vector of each cluster.

Figure 15 shows the distribution of the obtained cluster IDs on the actual image (map). Blank areas are water and missing values. In 2009 and 2010 only cluster 3 is present, while from 2011 the other clusters appear. Rather than correctly capturing year-to-year changes of the vegetation of Shikoku, this may reflect the size of the errors at both ends of the modeling window, or an interannual change of the sensor. Nevertheless, the abnormal cluster 2 occurs mainly in coastal and waterside areas, and clusters 0 and 1 also show characteristic locations, so some spatial pattern may be present.

(1) 2009  (2) 2010  (3) 2011  (4) 2012
Fig. 15. Cluster distribution for 2009–2012.

Figure 16 shows a representative time series of each cluster together with its modeling result. This result also shows that cluster 2 is abnormal as vegetation data.

(a) data example of cluster 0  (b) data example of cluster 1  (c) data example of cluster 2  (d) data example of cluster 3
Fig. 16. Representative data of each cluster and their modeling results (blue: data, green: model).

Finally, Fig. 15 is compared with the vegetation map of Shikoku in Fig. 18. Shikoku contains evergreen forests, deciduous forests, farmland and so on, but at present the clustering result does not appear to reflect the characteristics of this vegetation distribution correctly. To improve the clustering so that it reflects the vegetation, how the eight model parameters now in use should be employed must also be examined in the future. The result should also have been compared with the vegetation inferred from the spectra of the satellite images, but this could not be done in time and is left as future work.

Fig. 18. Vegetation map of Shikoku [15].
5. Conclusions

In this study, the statistical modeling of the spatio-temporal change of vegetation indices with the MAP and maximum-likelihood methods was tested by applying it to the vegetation of Shikoku using the high-spatial-resolution (250 m/pixel) vegetation index MODIS EVI, and a clustering of the obtained model parameters was examined. The experiments used 299 images covering 13 years and the method proposed by [2] (2008). The preliminary experiments showed that for both the ML and the MAP method the sliding-window scheme is effective for obtaining stable solutions, and that about 10 trials are sufficient.

Building on these results, the time-series extraction and the modeling were carried out as distributed processing with Hadoop and MapReduce using the method of [5] (2014). Clustering the model parameters with the x-means method produced four clusters, but compared with the actual vegetation map it cannot be said that a clustering reflecting the characteristics of the vegetation distribution was achieved. The likely causes are that the clustering result was dominated by regions without vegetation and by failures of the modeling (in particular in the parameters d1 and d2). The distribution also changes with time, which is another anomalous feature, so further examination is needed. Nevertheless, the fact that the whole process from the modeling to higher-level knowledge discovery could actually be carried out is a result of this study.

In the future, in order to apply the method concretely to the discrimination of vegetation classes such as farmland and evergreen or deciduous forest and to the understanding of long-term change, it will be necessary to compare the results with field-survey data and, through data selection and restriction of the degrees of freedom of the modeling, to reach a practically usable performance and apply the method to real problems. As for the modeling, a long computation time is still required even with distributed processing, so reducing the file writes and the data input/output, and speeding up the computation with parallel processing such as GPGPU, also need to be examined.
Acknowledgements

In carrying out this research I received careful and enthusiastic guidance from my supervisor, Associate Professor Rie Honda of the Faculty of Science. I express my sincere gratitude here. I also thank the members of the laboratory for their guidance and support.
References
[1] Rie Honda, Temporal modeling and missing data estimation for MODIS vegetation data, The 2nd NASA Data Mining Workshop, 2006.
[2] Extraction of long-term annual change from vegetation-index data of Earth-observation satellites, using the ML and MAP methods, Graduation thesis, Kochi University, 2008.
[3] Statistical modeling of the spatio-temporal change of vegetation indices: a study with the GIMMS NDVI data, Graduation thesis, Kochi University, 2009.
[4] Mining the spatio-temporal change of vegetation indices: speed-up with Xgrid and decision-tree filtering of unsuitable data, Graduation thesis, Kochi University, 2010.
[5] Construction of a distributed-processing system for time-series satellite images and knowledge discovery about spatio-temporal change, Graduation thesis, Kochi University, 2014.
[6] JAXA (Japan Aerospace Exploration Agency), Observation methods using sensors, http://www.eorc.jaxa.jp/hatoyama/experience/rm_kiso/mecha_howto.html
[7] Amazon Web Services, Inc., AWS - Amazon Web Services, http://aws.amazon.com/jp/, 2015.
[8] LP DAAC, Land Processes Distributed Active Archive Center, https://lpdaac.usgs.gov, 2014.
[9] Roy Spencer, Ph.D., climatologist, author, former NASA scientist, http://www.drroyspencer.com/2012/05, 2012.
    Wikipedia, Terra (satellite), http://en.wikipedia.org/wiki/Terra_(satellite), 2015.
[10] The HDF Group, HDF4, http://www.hdfgroup.org/products/, 2014.
[11] The HDF Group, HDFView, http://www.hdfgroup.org/products/java/hdfview/, 2014.
[12] Advanced Micro Devices, Inc., Aparapi - AMD, http://developer.amd.com/tools-and-sdks/opencl-zone/aparapi, 2014.
[13] On an extension of the k-means algorithm that determines the number of clusters automatically, Research and Development Division, National Center for University Entrance Examinations, 2000.
[14] The University of Waikato, Weka, http://www.cs.waikato.ac.nz/ml/weka/downloading.html
[15] Forests of Japan (Botanical Gardens, Tohoku University), http://www.biology.tohoku.ac.jp/garden/forest-japan.html
A. Programs for extracting the sample data
A.1 Point extraction
A.1.1 Program that extracts the value at a specified point: pointex.c
#include <stdio.h>
#include <stdlib.h>
#define IM 4800
int main(int argc, char *argv[]) {
short buf[IM];
FILE *fp;
int i;
int size;
int x;
int y;
int width=IM;
int height=IM;
long offset;
if(argc != 4){
printf("Wrong number of arguments\n");
printf("Usage: %s <EVI binary file> x y\n",argv[0]);
exit(1);
}
x=atoi(argv[2]);
y=atoi(argv[3]);
fp=fopen(argv[1],"rb");
if(fp == NULL){
printf("Cannot open file %s\n",argv[1]);
return 0;
}
offset=(width*y)*sizeof(short);
fseek(fp, offset,SEEK_SET);
size=fread(buf,sizeof(buf),1,fp);
printf("%d",buf[x]);
printf("\n");
fclose(fp);
return 0;
}
A.1.2 Shell script that runs the point-extraction program over all HDF files: ex.sh
#!/bin/bash
rm run.sh
touch run.sh
for i in $( ls *.hdf | sed s/".hdf"//g ); do
#hdp dumpsds -n '"1 km 16 days EVI"' -o $i.evi -b $i.hdf
echo hdp dumpsds -n '"250m 16 days EVI"' -o $i.evi -b $i.hdf >>
run.sh
done
chmod +x run.sh
./run.sh
for i in $( ls *.hdf | sed s/".hdf"//g ); do
#ls -l $i.evi
pointex $i.evi 10 2410
done
A.2 Extraction of a rectangular region
A.2.1 Region-extraction program: squ.c
#include <stdio.h>
#include <stdlib.h>
#define IM 4800
int main(int argc, char *argv[]) {
short buf[IM];
FILE *fp,*fpw;
int i,j;
int size;
//int x=1;
//int y=3584;
int x;// x coordinate of the upper-left corner of the region to extract
int y;// y coordinate of the upper-left corner of the region to extract
int xw;// width of the region to extract
int yw;// height of the region to extract
int w;
int width=IM;
int height=IM;
long offset;
if(argc != 7){
printf("Wrong number of arguments\n");
printf("Usage: %s <input binary file> <output file> x y (upper-left corner) width height\n",argv[0]);
exit(1);
}
x=atoi(argv[3]);
y=atoi(argv[4]);
xw=atoi(argv[5]);
yw=atoi(argv[6]);
fp=fopen(argv[1],"rb");
if(fp == NULL){
printf("Cannot open file %s\n",argv[1]);
return 0;
}
/* open the output file */
if ( (fpw = fopen(argv[2], "wb"))==NULL ){
printf("Cannot open the output file\n");
exit(1);
}
// seek to the starting row y
offset=(width*y)*sizeof(short);
fseek(fp, offset,SEEK_SET);
for (j=0; j< yw; ++j) {
offset=(x)*sizeof(short);
fseek(fp, offset,SEEK_CUR);
size=fread(buf,sizeof(short),xw,fp);
// output one extracted row (debug print, then write it to the output file)
printf("%d %d¥n",j, buf[0]);
fwrite(buf,sizeof(short),xw,fpw);
offset=(width-x-xw)*sizeof(short);
fseek(fp, offset,SEEK_CUR);
}
printf("\n");
fclose(fp);
fclose(fpw);
return 0;
}
A.2.2 Shell script that runs the region-extraction program over all files: ex_squ.sh
�
#!/bin/bash
num=0
for i in $( ls *.evi | sort | sed s/".evi"//g ); do
num=$(( $num +1 ))
echo $num
# ls -l $i.evi
# ./squ $i.evi shikoku/"$num"_$i.s.evi 0 2400 1000 1000
#./squ $i.evi shikoku_0_2400_1000_1000/$i.s.evi 0 2400 1000 1000
mkdir shikoku_30_2600_1000_1000
./squ $i.evi shikoku_30_2600_1000_1000/"$num"_$i.s.evi 30 2600 1000 1000
done
�
B. Hadoop configuration files
B.1. hadoop-env.sh
# Set Hadoop-specific environment variables here.
# The only required environment variable is JAVA_HOME. All others are
# optional. When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.
# The java implementation to use. Required.
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun
export
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/H
ome
# Extra Java CLASSPATH elements. Optional.
# export HADOOP_CLASSPATH=
# The maximum amount of heap to use, in MB. Default is 1000.
# export HADOOP_HEAPSIZE=2000
export HADOOP_HEAPSIZE=3000
# Extra Java runtime options. Empty by default.
# export HADOOP_OPTS=-server
# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote
$HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote
$HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote
$HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote
$HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote
$HADOOP_JOBTRACKER_OPTS"
# export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
# export HADOOP_CLIENT_OPTS
# Extra ssh options. Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
# Where log files are stored. $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
export HADOOP_LOG_DIR=/private/var/netboot/Users/Shared/b113k289/logs
# File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
# host:path where hadoop code should be rsync'd from. Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop
# Seconds to sleep between slave commands. Unset by default. This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1
# The directory where pid files are stored. /tmp by default.
# NOTE: this should be set to a directory that can only be written to by
# the users that are going to run the hadoop daemons. Otherwise there
is
# the potential for a symlink attack.
# export HADOOP_PID_DIR=/var/hadoop/pids
# A string representing this instance of hadoop. $USER by default.
# export HADOOP_IDENT_STRING=$USER
# The scheduling priority for daemon processes. See 'man nice'.
# export HADOOP_NICENESS=10
export HADOOP_HOME_WARN_SUPPRESS="TRUE"
export JAVA_OPTS="-Dfile.encoding=UTF-8 -Djava.awt.headless=true"
B.2.hdfs-site.xml
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>dfs.replication</name>
<value>3</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>/private/var/netboot/Users/Shared/b113k289/name</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>/private/var/netboot/Users/Shared/b113k289/data</value>
</property>
<property>
<name>dfs.block.size</name>
<value>33554432</value>
</property>
</configuration>
B.3. core-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>/private/var/netboot/Users/Shared/b113k289/tmp/hadoop-${user.na
me}</value>
</property>
<property>
<name>fs.checkpoint.dir</name>
<value>/private/var/netboot/Users/Shared/b113k289/checkpoint</value>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://pc1-56:9000</value>
</property>
<property>
<name>hadoop.native.lib</name>
<value>false</value>
</property>
<property>
<name>io.file.buffer.size</name>
<value>4096</value>
<description>
default 4KB(4096)
eulogy 128KB(131072)
</description>
</property>
</configuration>
B.4.mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>pc1-55:9001</value>
</property>
<property>
<name>mapred.system.dir</name>
<value>/private/var/netboot/Users/Shared/b113k289/system</value>
</property>
<property>
<name>mapred.tasktracker.map.tasks.maximum</name>
<value>1</value>
</property>
<property>
<name>mapred.tasktracker.reduce.tasks.maximum</name>
<value>1</value>
</property>
<property>
<name>mapred.map.tasks</name>
<value>20</value>
</property>
<property>
<name>mapred.reduce.tasks</name>
<value>20</value>
</property>
<property>
<name>mapred.child.java.opts</name>
<value>-Xmx8192m</value>
<description>
memorysize_default:200m
memorysize:8192m
</description>
</property>
<property>
<name>mapred.compress.map.output</name>
<value>true</value>
</property>
<property>
<name>mapred.task.timeout</name>
<value>14400000</value>
<description>The number of milliseconds before a task will
be
terminated if it neither reads an input, writes
an output, nor updates its status string.
</description>
</property>
</configuration>
C. Hadoop MapReduce programs for time-series extraction
C.1. groupkey.java
/* MapReduce program that extracts per-pixel time series from images with 2-byte pixels.
   The numeric prefix (before "_") of each input file name is used as its time index. */
*/
package key.group.array;
import java.lang.String;
import java.util.Date;
import java.util.Formatter;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
public class groupkey extends Configured implements Tool{
@Override
public int run(String args[]) throws Exception{
Configuration conf = getConf();
if (conf == null) {
return -1;
}
String[] otherArgs = new GenericOptionsParser(conf,
args).getRemainingArgs();
// conf.set("mapred.compress.map.output","true");
//
conf.set("mapred.map.output.compression.codec","org.apache.hadoop.io.c
ompress.GzipCodec");
Job job = new Job(conf, "VIntArray_new");
job.setJarByClass(groupkey.class);
job.setMapperClass(groupkeymapper.class);
job.setReducerClass(groupkeyreducer.class);
job.setInputFormatClass(WholeFileInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
job.setNumReduceTasks(Integer.parseInt(args[2]));
job.setMapOutputKeyClass(VIntWritable.class);
job.setMapOutputValueClass(VIntArrayWritable.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
Formatter formatter = new Formatter();
String outpath = "Out"
+ formatter.format("%1$tm%1$td%1$tH%1$tM%1$tS", new
Date());
FileOutputFormat.setOutputPath(job, new Path(outpath));
job.setPartitionerClass(KeyPartitioner.class);
//FileOutputFormat.setOutputPath(job, new Path(outpath));
return job.waitForCompletion(true) ? 0 : 1;
}
public static void main(String[] args) throws Exception {
int exitCode = ToolRunner.run(new groupkey(), args);
System.exit(exitCode);
}
}
C.2. groupkeymapper.java
package key.group.array;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.mapreduce.Mapper;
public class groupkeymapper extends Mapper<NullWritable, BytesWritable,
VIntWritable, VIntArrayWritable>{
//public static final int OFFSET=8;
//public static final int XAXIS=1152;
//public static final int YAXIS=1152;
public static final int OFFSET=0;
public static final int XAXIS=1000;
public static final int YAXIS=1000;
public void map(NullWritable key, BytesWritable value, Context context)
throws IOException, InterruptedException {
//int k,data_count=0;//1@.�$?1/'+
//int task_num;//key��U
int total=XAXIS*YAXIS;//1@.��r
int numReducetasks = (context.getNumReduceTasks());
int data_limit;//.+'����1@.)$,
if((total%numReducetasks) == 0){
data_limit = (int)(total/numReducetasks);//.+'�����1@.
)$,
}else{
data_limit = ((int)(total/numReducetasks))+1;//.+'�����
1@.)$,
}
//6#$=��u�!Tk;
List<VIntWritable> valuelist = new ArrayList<VIntWritable>();
// VIntWritable[] vintvalue ;
// = new VIntWritable[data_limit];
Configuration conf = context.getConfiguration();
String filename =conf.get("map.input.file");
String[] url=filename.split("/");
String[] kariname=url[url.length-1].split("_");
//X�E|
// conf.setBoolean("mapred.compress.map.output",true);
//
conf.steClass("mapred.map.output.compression.codec",SnappyCodec.class,
CompressionCodec.class);
// Job job=new Job(conf);
//VITk�����P!Tk
byte[] data = value.getBytes();
BytesWritable bw =new BytesWritable();
for(int i = 0 ; i < numReducetasks; i++) {
valuelist.clear();
valuelist.add(new
VIntWritable(Integer.valueOf(kariname[0])));
valuelist.add(new VIntWritable(i*data_limit));
for(int j =
i*data_limit*2+OFFSET;j<((i+1)*data_limit*2+OFFSET) && j <
(XAXIS*YAXIS*2+OFFSET);j+=2) {
bw.set(data,j,2);
byte[] bws=bw.getBytes();
int vi=(bws[1]<<8)+(bws[0]&0xFF);
valuelist.add(new VIntWritable(vi));
}
VIntWritable[] valuearray =
(VIntWritable[])valuelist.toArray(new VIntWritable[valuelist.size()]);
//System.out.println("numreducetasks:"+numReducetasks+",datalimit:"+da
ta_limit+",start_j:"+(i*data_limit*2+OFFSET)+",end_j:"+((i+1)*data_lim
it*2+OFFSET)+",valuelist:"+valuelist.size());
context.write(new VIntWritable(i), new
VIntArrayWritable(valuearray));
}
//�I��¡�J1@.�����1@.!Tk
/* for(int j=0;j<XAXIS;j+=1){//)?8<?(e����\v��m
for(int i=0;i<YAXIS;i+=1){//)?8<?(e����\v��m
//�I�� VIHTk
k=2*XAXIS*j+2*i+OFFSET;
bw.set(data,k,2);
byte[] bws=bw.getBytes();
int vi=(bws[1]<<8)+(bws[0]&0xFF);
int context_key=j*1152+i;//h}
//1@.�$?1/'+
data_count=context_key%data_limit;
vintvalue[data_count]=new
VIntWritable(Integer.valueOf(kariname[0]));//tu
vintvalue[data_count]=new VIntWritable(vi);//{�
vintvalue[data_count]=new VIntWritable(context_key);//h}
//data_limit�r��H!Tk��H! Reduce���
if (data_count == data_limit-1){
task_num= (int)
Math.floor(context_key/data_limit);//keyEl
context.write(new VIntWritable(task_num), new
VIntTwoDArrayWritable(vintvalue));
}
}
}
*/
}
}
C.3. groupkeyreducer.java
package key.group.array;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map.Entry;
import java.util.TreeMap;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Reducer;
public class groupkeyreducer
extends Reducer<VIntWritable, VIntArrayWritable, Text, Text> {
//public static final int XAXIS=1152;
//public static final int YAXIS=1152;
public static final int XAXIS=1000;
public static final int YAXIS=1000;
@Override
public void reduce(VIntWritable key, Iterable<VIntArrayWritable>
values,
Context context)
throws IOException, InterruptedException {
TreeMap<Integer,Integer[]> tree =new
TreeMap<Integer,Integer[]>();
List<Integer> valuelist = new ArrayList<Integer>();
int axis=0;
for(VIntArrayWritable value:values) {
valuelist.clear();
Writable[] val = value.get();
for(int i = 2;i<val.length;i++) {
valuelist.add(((VIntWritable)val[i]).get());
}
tree.put(((VIntWritable)val[0]).get(),valuelist.toArray(new
Integer[valuelist.size()]));
axis = ((VIntWritable)val[1]).get();
}
// int axis_start = tree.get(tree.firstKey())[0];
valuelist = null;
values = null;
System.gc();
int[][] datalist = new
int[tree.get(tree.firstKey()).length][tree.size()];
Integer[] data_tmp;
int count=0;
for(Entry<Integer,Integer[]> e:tree.entrySet()) {
data_tmp = e.getValue();
for(int i=0;i<data_tmp.length;i++) {
datalist[i][count] = data_tmp[i].intValue();
}
count++;
}
// Integer[] keyarray = tree.keySet().toArray(new
Integer[valuelist.size()]);
//System.out.printf("axis%d¥n",axis);
int x = axis%XAXIS;
int y = axis/YAXIS;
int j;
StringBuilder strbuil;
int[] stat;
for(int i=0;i<datalist.length;i++) {
// Integer[] tmplist = tree.get
stat = new int[6];
x = (axis+i)%XAXIS;
y = (axis+i)/YAXIS;
// key�OR
//String keyaxis = String.format("%4d,%4d",x,y);
String keyaxis = String.format("x.%d,y.%d",x,y);
strbuil = new StringBuilder(3*datalist[0].length);
strbuil.append("1:");
// value�OR
for(int point:datalist[i]) {
if(stat[0] < point) {
stat[0] = point;
stat[1] = stat[5];
}else if(stat[2] > point) {
stat[2] = point;
stat[3] = stat[5];
}
stat[4]+=point;
stat[5]++;
strbuil.append(point);
strbuil.append(",");
}
strbuil.deleteCharAt(strbuil.length()-1);
// context.write(new Text(keyaxis),new Text("x]��g
"+stat[0]+"tu"+stat[1]+"xd��g"+stat[2]+"tu"+stat[3]+"fYH
"+(double)(stat[4]/stat[5])));
//System.out.println(stat[5]);
context.write(new Text(keyaxis),new
Text(strbuil.toString()));
}
// HashMap<Integer,Integer[]> map_max = new
HashMap<Integer,Integer[]>();
//HashMap<Integer,Integer[]> map_min= new
HashMap<Integer,Integer[]>();
/* int x=0,y=0;
final int gazou_size=1152;
String str;
for(int i=0;i<XAXIS;i+=1){//)?8<?(e����\v��m
for(int j=0;j<YAXIS;j+=1){//)?8<?(e����\v��m
int zahyou =j*1152+i;
if(map_max.containsKey(zahyou)==true){
Integer [] max_value = map_max.get(zahyou);
//System.out.println(zahyou);
x=(zahyou/gazou_size)*1;//)?8<?(e����\v��m
y=zahyou-((zahyou/gazou_size)*gazou_size);
str="x]��g"+max_value[1]+"tu"+max_value[0]+"xd�
�g"+max_value[4]+"tu"+max_value[3]+"fYH
"+(max_value[5]/max_value[6]);
//System.out.printf("x]u�:%d x]H:%d h}:%d xdu
�:%d xdH:%d �r:%d Gr:%d
¥n",max_value[0],max_value[1],max_value[2],max_value[3],max_value[4],m
ax_value[5],max_value[6]);
context.write(new Text(x+","+y),new Text(str));
}
}
}
*/
}
}
C.4. KeyPartitioner.java
package key.group.array;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.mapreduce.Partitioner;
public class KeyPartitioner extends
Partitioner<VIntWritable,VIntArrayWritable> {
@Override
public int getPartition(VIntWritable key,VIntArrayWritable value,int
numPartitions) {
return key.get();
// return key.getFirstKey();
// Writable[] keydata = key.get();
//
//// int keynum= key.get();
// return ((VIntWritable)keydata[0]).get();
}
}
C.5. VIntArrayWritable.java
package key.group.array;
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.io.Writable;
public class VIntArrayWritable extends ArrayWritable{
public VIntArrayWritable(){
super(VIntWritable.class);
}
public VIntArrayWritable(VIntWritable[] colArr){
super(VIntArrayWritable.class,colArr);
}
public int getFirstKey() {
Writable[] keydata = this.get();
// int keynum=key.get();
return ((VIntWritable)keydata[0]).get();
}
public int getSecondKey() {
Writable[] keydata = this.get();
return ((VIntWritable)keydata[1]).get();
}
}
C.6. WholeFileInputFormat.java
package key.group.array;
// cc WholeFileInputFormat An InputFormat for reading a whole file as a
record
import java.io.IOException;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.lib.input.*;
// vv WholeFileInputFormat
public class WholeFileInputFormat extends FileInputFormat<NullWritable,
BytesWritable> {
public RecordReader<NullWritable, BytesWritable>
createRecordReader(
InputSplit split, TaskAttemptContext context)
throws IOException,
InterruptedException {
return new WholeFileRecordReader(split, context);
}
protected boolean isSplitable(JobContext context, Path filename) {
return false;
}
}
// ^^ WholeFileInputFormat
C.7. WholeFileRecordReader.java
package key.group.array;
// cc WholeFileRecordReader The RecordReader used by WholeFileInputFormat
for reading a whole file as a record
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.*;
// vv WholeFileRecordReader
public class WholeFileRecordReader
extends RecordReader<NullWritable, BytesWritable> {
private FileSplit fileSplit;
private Configuration conf;
private boolean processed = false;
// private LongWritable key = new LongWritable();
private BytesWritable value = new BytesWritable();
public WholeFileRecordReader(InputSplit inputSplit,
TaskAttemptContext context)
throws IOException, InterruptedException {
initialize(inputSplit, context);
}
public void initialize(InputSplit inputSplit, TaskAttemptContext
context)
throws IOException, InterruptedException {
this.fileSplit = (FileSplit)inputSplit;
this.conf = context.getConfiguration();
}
public NullWritable getCurrentKey()
throws IOException, InterruptedException {
return NullWritable.get();
}
public BytesWritable getCurrentValue()
throws IOException, InterruptedException {
return value;
}
public float getProgress()
throws IOException, InterruptedException {
return processed ? 1.0f : 0.0f;
}
public boolean nextKeyValue()
throws IOException, InterruptedException {
if (!processed) {
byte[] contents = new byte[(int) fileSplit.getLength()];
Path file = fileSplit.getPath();
FileSystem fs = file.getFileSystem(conf);
FSDataInputStream in = null;
try {
in = fs.open(file);
in.readFully(0, contents, 0, contents.length);
value.set(contents, 0, contents.length);
conf.set("map.input.file",
conf.get("mapred.input.dir") + "/" + file.getName());
} finally {
in.close();
}
processed = true;
return true;
}
return false;
}
public void close() throws IOException {
// do nothing
}
}
// ^^ WholeFileRecordReader
D. Hadoop MapReduce programs for block-wise time-series extraction (package spacetotimeline)
D.1. DoubleArrayWritable.java
package spacetotimeline;
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Writable;
public class DoubleArrayWritable extends ArrayWritable{
public DoubleArrayWritable(){
super(DoubleWritable.class);
}
public DoubleArrayWritable(DoubleWritable[] colArr){
super(DoubleArrayWritable.class,colArr);
}
public double getFirstKey() {
Writable[] keydata = this.get();
// int keynum=key.get();
return ((DoubleWritable)keydata[0]).get();
}
public double getSecondKey() {
Writable[] keydata = this.get();
return ((DoubleWritable)keydata[1]).get();
}
}
D.2. KeyPartitioner.java
package spacetotimeline;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.mapreduce.Partitioner;
public class KeyPartitioner extends
Partitioner<VIntWritable,DoubleArrayWritable> {
@Override
public int getPartition(VIntWritable key,DoubleArrayWritable
value,int numPartitions) {
return key.get();
// return key.getFirstKey();
// Writable[] keydata = key.get();
//
//// int keynum=key.get();
// return ((VIntWritable)keydata[0]).get();
}
}
D.3. MeteorologicalDriver.java
package spacetotimeline;
import java.util.Date;
import java.util.Formatter;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
public class MeteorologicalDriver extends Configured implements Tool {
@Override
public int run(String args[]) throws Exception{
Configuration conf = getConf();
if (conf == null) {
return -1;
}
String[] otherArgs = new GenericOptionsParser(conf,
args).getRemainingArgs();
// conf.set("mapred.compress.map.output","true");
//
conf.set("mapred.map.output.compression.codec","org.apache.hadoop.
io.compress.GzipCodec");
Job job = new Job(conf, "SpaceToTimeline_Meteorological");
job.setJarByClass(MeteorologicalDriver.class);
job.setMapperClass(MeteorologicalMapper.class);
job.setReducerClass(MeteorologicalReducer.class);
job.setInputFormatClass(WholeFileInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
job.setNumReduceTasks(Integer.parseInt(args[2]));
job.setMapOutputKeyClass(VIntWritable.class);
job.setMapOutputValueClass(DoubleArrayWritable.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
Formatter formatter = new Formatter();
// String outpath = "Out"
// + formatter.format("%1$tm%1$td%1$tH%1$tM%1$tS",
new Date());
// FileOutputFormat.setOutputPath(job, new Path(outpath));
job.setPartitionerClass(KeyPartitioner.class);
FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
return job.waitForCompletion(true) ? 0 : 1;
}
public static void main(String[] args) throws Exception {
int exitCode = ToolRunner.run(new MeteorologicalDriver(), args);
System.exit(exitCode);
}
}
D.4. MeteorologicalMapper.java
package spacetotimeline;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.mapreduce.Mapper;
public class MeteorologicalMapper extends Mapper<NullWritable,
BytesWritable, VIntWritable, DoubleArrayWritable>{
// ���I�[V��a
// http://weather.is.kochi-u.ac.jp/wiki/archive�ALL
//public static final int OFFSET=43;
//public static final int XAXIS=560;
//public static final int YAXIS=560;
//public static final int BLOCK=1; //fY�B^H!���7>/'�SC
// {�p}�[V��a
// public static final int OFFSET=8;
// public static final int XAXIS=1152;
// public static final int YAXIS=1152;
//250m{�p}�[V��a
public static final int OFFSET=0;
public static final int XAXIS=100;//0+2b�
public static final int YAXIS=100;
//public static final int XAXIS=1000;//b�
//public static final int YAXIS=1000;
public static final int BLOCK=1;
public static final int DEPTH=2;//1pixel�4$2r
public void map(NullWritable key, BytesWritable value, Context context)
throws IOException, InterruptedException {
int total = XAXIS*YAXIS;//1@.��r
int blocktotal = (XAXIS/BLOCK)*(YAXIS/BLOCK);//7>/'����5'
-=r
int numReducetasks = (context.getNumReduceTasks());// reduce.+
'
int data_limit;//.+'����1@.)$,
//.+'�����1@.)$,
if((blocktotal%numReducetasks) == 0){
data_limit = (int)(blocktotal/numReducetasks);
}else{
data_limit = ((int)(blocktotal/numReducetasks))+1;
}
/* if((total%numReducetasks) == 0){
data_limit = (int)(total/numReducetasks);
}else{
data_limit = ((int)(total/numReducetasks))+1;
}
*/
//6#$=��u�!Tk;
List<DoubleWritable> valuelist = new ArrayList<DoubleWritable>();
Configuration conf = context.getConfiguration();
String filename =conf.get("map.input.file");
String[] url=filename.split("/");
//int time = Integer.parseInt(url[url.length-1].substring(4,12));
int time = Integer.parseInt(url[url.length-1].substring(9,16));
//VITk�����P!Tk
byte[] data = value.getBytes();
System.out.println("datalength="+data.length);
if(data.length<XAXIS*YAXIS*DEPTH) {
//LR1@.�~�����)$,�0�[V
for(int i = 0 ; i < numReducetasks; i++) {
valuelist.clear();
valuelist.add(new DoubleWritable(time));
valuelist.add(new DoubleWritable(i*data_limit));
if(i < numReducetasks-1) {
for(int j=0;j<data_limit;j++) {
valuelist.add(new DoubleWritable(-100));
}
//
bw.set(data,i*data_limit+OFFSET,data_limit);
}else {
for(int
j=0;j<data_limit-((data_limit*numReducetasks)-blocktotal);j++) {
valuelist.add(new DoubleWritable(-100));
}
//
bw.set(data,i*data_limit+OFFSET,data_limit-((data_limit*numReducetasks
)-total));
}
DoubleWritable[] valuearray =
(DoubleWritable[])valuelist.toArray(new
DoubleWritable[valuelist.size()]);
context.write(new VIntWritable(i), new
DoubleArrayWritable(valuearray));
}
}else{
//LR1@.�� �1Byte
// BytesWritable bw =new BytesWritable();
byte[] bws ;
double[] blocktmp;
byte[][] arraydata = new byte[YAXIS][XAXIS];
int count;
double sum;
List<DoubleWritable> meandata = new
ArrayList<DoubleWritable>();
//MD�=@8
for(int j = 0 ; j < YAXIS-(BLOCK-1) ; j+=BLOCK) {
for(int i = 0 ; i < XAXIS-(BLOCK-1) ; i+=BLOCK) {
blocktmp = new double[BLOCK*BLOCK];
// System.out.printf("x.%4d y.%4d��%d�7>/'!nO
¥n",(j*XAXIS+i)%XAXIS,(j*XAXIS+i)/YAXIS,BLOCK);
// arraydata[j][i] = data[j*YAXIS + i + OFFSET];
//7>/'�=@8
for(int l = 0 ; l < BLOCK ; l++) {
for(int k = 0 ; k < BLOCK ; k++) {
// System.out.printf("x.%4d
y.%4d : %d¥n",(j*XAXIS+i+l*XAXIS+k)%XAXIS,(j*XAXIS+i+l*XAXIS+k)/YAXIS,
(int)data[OFFSET+j*XAXIS+i+l*XAXIS+k]&0xFF);
blocktmp[l*BLOCK+k] =
(int)data[OFFSET+j*XAXIS+i+l*XAXIS+k]&0xFF;
}
}
count=0;
sum=0;
for(double point : blocktmp) {
if(point>80) {
count++;
sum+=point;
}
}
if(count == 0){
sum=0;
}else{
sum/=count;
}
// System.out.println(sum);
// median�[V
// Arrays.sort(blocktmp);
// sum=blocktmp[(int)Math.floor(count/2)];
meandata.add(new DoubleWritable(sum));
}
}
System.out.println("meandata"+meandata.size());
//int[][] intarraydata = new
int[(int)Math.floor(YAXIS/BLOCK)][(int)Math.floor(XAXIS/BLOCK)];
for(int i = 0 ; i < numReducetasks; i++) {
System.out.println("meandata"+meandata.size());
valuelist.clear();
System.out.println("meandata"+meandata.size());
if(data_limit*(i+1) > blocktotal) {
System.out.printf("Array get start:%d
end:%d¥n",data_limit*i,blocktotal);
valuelist = meandata.subList(0,meandata.size());
}else {
System.out.printf("Array get start:%d
end:%d¥n",data_limit*i,data_limit*(i+1));
valuelist = meandata.subList(0,data_limit);
}
System.out.println("time"+time+" i*data_limit:"+data_limit*i);
valuelist.add(0,new DoubleWritable(time*1.0));
valuelist.add(1,new DoubleWritable(i*data_limit*1.0));
DoubleWritable[] valuearray = (DoubleWritable[])valuelist.toArray(new DoubleWritable[valuelist.size()]);
//
System.out.println("valuearray_length:"+valuearray.length);
context.write(new VIntWritable(i), new DoubleArrayWritable(valuearray));
}
/*
for(int i = 0 ; i < numReducetasks; i++) {
valuelist.clear();
valuelist.add(new VIntWritable(time));
valuelist.add(new VIntWritable(i*data_limit));
if(i < numReducetasks-1) {
// bw.set(data,i*data_limit+OFFSET,data_limit);
bws =
Arrays.copyOfRange(data ,i*data_limit+OFFSET,(i+1)*data_limit );
System.out.println("first:"+(i*data_limit+OFFSET)+",end:"+data_limit);
}else {
//
bw.set(data,i*data_limit+OFFSET,data_limit-((data_limit*numReducetasks
)-total));
bws = Arrays.copyOfRange(data,
i*data_limit+OFFSET,data.length);
System.out.println("first:"+(i*data_limit+OFFSET)+",end:"+data_limit);
}
// byte[] bws =
Arrays.copyOfRange(bw.getBytes(),0,bw.getLength());
for(byte point:bws) {
valuelist.add(new VIntWritable((int)point&0xFF));
}
VIntWritable[] valuearray =
(VIntWritable[])valuelist.toArray(new VIntWritable[valuelist.size()]);
//
System.out.println("valuearray_length:"+valuearray.length);
context.write(new VIntWritable(i), new
VIntArrayWritable(valuearray));
}
*/
}
}
}
D.5. MeteorologicalReducer.java
package spacetotimeline;
import java.io.IOException;
import java.util.Date;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;
import java.util.Map.Entry;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Reducer;
public class MeteorologicalReducer extends Reducer<VIntWritable, DoubleArrayWritable, Text, Text>{
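// Reducer: gathers the block values for each key, sorts the records by observation time with a TreeMap, and writes one comma-separated time series per block coordinate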
//public static final int XAXIS=560;
//public static final int YAXIS=560;
//public static final int BLOCK=10;
//public static final int XAXIS=1152;
//public static final int YAXIS=1152;
//public static final int BLOCK=10;
//settings for the 250 m vegetation index data
public static final int XAXIS=100;//test run
public static final int YAXIS=100;
//public static final int XAXIS=1000;//full experiment
//public static final int YAXIS=1000;
public static final int BLOCK=1;
@Override
public void reduce(VIntWritable key, Iterable<DoubleArrayWritable> values, Context context)
throws IOException, InterruptedException {
System.out.println("maxMemory:"+Runtime.getRuntime().maxMemory()/(1024*1024)+"MB");
System.out.println("totalMemory:"+Runtime.getRuntime().totalMemory()/(1024*1024)+"MB");
System.out.println("usedMemory:"+(Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory())/(1024*1024)+"MB");
TreeMap<Long,Double[]> tree =new TreeMap<Long,Double[]>();
List<Double> valuelist = new ArrayList<Double>();
int axis=0;
SimpleDateFormat format = new SimpleDateFormat("yyMMddHH");
Date date;
//store the data received from the map phase, sorting it by time with a TreeMap
for(DoubleArrayWritable value:values) {
valuelist.clear();
Writable[] val = value.get();
for(int i = 2;i<val.length;i++) {
valuelist.add(((DoubleWritable)val[i]).get());
}
String time_data = Integer.toString((int)((DoubleWritable)val[0]).get());
// String time_data = Double.toString(((DoubleWritable)val[0]).get());
// System.out.println(time_data);
long hour_count=0;
try {
date = format.parse(time_data);
long time = date.getTime()/1000;
hour_count = time/(60*60);
} catch (ParseException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
tree.put(hour_count,valuelist.toArray(new Double[valuelist.size()]));
axis = (int)((DoubleWritable)val[1]).get();
}
long first_hour = tree.firstKey();
int dist_hour = (int)(tree.lastKey()-tree.firstKey());
Double[] tmp_axis = tree.get(tree.firstKey());
System.out.println("firsttime:"+tree.firstKey()+",Lasttime:"+tree.lastKey()+",dist_hour"+dist_hour+",first_axis:"+axis+",last_axis"+(axis+tmp_axis.length));
System.out.println("totalMemory:"+Runtime.getRuntime().totalMemory()/(1024*1024)+"MB");
System.out.println("usedMemory:"+(Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory())/(1024*1024)+"MB");
values = null;
valuelist = null;
System.gc();
System.out.println("totalMemory:"+Runtime.getRuntime().totalMemory()/(1024*1024)+"MB");
System.out.println("usedMemory:"+(Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory())/(1024*1024)+"MB");
// int axis_start = tree.get(tree.firstKey())[0];
//initialize the data array
double[][] datalist = new double[tree.get(tree.firstKey()).length][dist_hour+1];
for(int i=0;i<datalist.length;i++) {
for(int j=0;j<datalist[i].length;j++) {
datalist[i][j] = -100.0;
}
}
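// -100.0 marks hours for which no observation was received; hours that are never overwritten below keep this value in the output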
System.out.println("usedMemory:"+(Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory())/(1024*1024)+"MB");
Double[] data_tmp;
data_tmp = tree.firstEntry().getValue();
for(Entry<Long,Double[]> e:tree.entrySet()) {
data_tmp = e.getValue();
int time = (int)(e.getKey()-first_hour);
for(int i=0;i<data_tmp.length;i++) {
datalist[i][time] = data_tmp[i];
// datalist[i][time] = data_tmp[i].intValue();
}
}
tree.clear();
int x = 0;
int y = 0;
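// recover the two-dimensional block coordinates (x, y) from the linear block index received from the map phase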
if(axis>0) {
x = axis%(XAXIS/BLOCK);
y = axis/(YAXIS/BLOCK);
}
StringBuilder strbuil;
for(int i=0;i<datalist.length;i++) {
if(i>0) {
x = (axis+i)%(XAXIS/BLOCK);
y = (axis+i)/(YAXIS/BLOCK);
}
String keyaxis = String.format("x.%04d,y.%04d",x,y);
// System.out.println(keyaxis+":");
strbuil = new StringBuilder(datalist[0].length*6);
String tmp;
for(double j:datalist[i]) {
// System.out.print(j+",");
tmp = String.format("%.3f",j);
strbuil.append(tmp+",");
}
// System.out.println();
String outdata = strbuil.toString().substring(0,strbuil.toString().length()-1);
context.write(new Text(keyaxis),new Text(outdata));
}
}
}
D.6. VIntArrayWritable.java
package spacetotimeline;
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.VIntWritable;
import org.apache.hadoop.io.Writable;
public class VIntArrayWritable extends ArrayWritable{
public VIntArrayWritable(){
super(VIntWritable.class);
}
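// the no-argument constructor is required so that Hadoop can instantiate the writable when deserializing it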
public VIntArrayWritable(VIntWritable[] colArr){
super(VIntWritable.class,colArr);// the element class must be VIntWritable; passing the array wrapper class here would break deserialization
}
public int getFirstKey() {
Writable[] keydata = this.get();
// int keynum=key.get();
return ((VIntWritable)keydata[0]).get();
}
public int getSecondKey() {
Writable[] keydata = this.get();
return ((VIntWritable)keydata[1]).get();
}
}
D.7. WholeFileInputFormat.java
package spacetotimeline;
// cc WholeFileInputFormat An InputFormat for reading a whole file as a record
import java.io.IOException;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.lib.input.*;
// vv WholeFileInputFormat
public class WholeFileInputFormat extends FileInputFormat<NullWritable,
BytesWritable> {
public RecordReader<NullWritable, BytesWritable>
createRecordReader(
InputSplit split, TaskAttemptContext context)
throws IOException,
InterruptedException {
return new WholeFileRecordReader(split, context);
}
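// returning false below tells Hadoop not to split the input file, so every file reaches a single map task as one record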
protected boolean isSplitable(JobContext context, Path filename) {
return false;
}
}
// ^^ WholeFileInputFormat
D.8. WholeFileRecordReader.java
package spacetotimeline;
// cc WholeFileRecordReader The RecordReader used by WholeFileInputFormat for reading a whole file as a record
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.*;
// vv WholeFileRecordReader
public class WholeFileRecordReader
extends RecordReader<NullWritable, BytesWritable> {
private FileSplit fileSplit;
private Configuration conf;
private boolean processed = false;
// private LongWritable key = new LongWritable();
private BytesWritable value = new BytesWritable();
public WholeFileRecordReader(InputSplit inputSplit,
TaskAttemptContext context)
throws IOException, InterruptedException {
initialize(inputSplit, context);
}
public void initialize(InputSplit inputSplit, TaskAttemptContext
context)
throws IOException, InterruptedException {
this.fileSplit = (FileSplit)inputSplit;
this.conf = context.getConfiguration();
}
public NullWritable getCurrentKey()
throws IOException, InterruptedException {
return NullWritable.get();
}
public BytesWritable getCurrentValue()
throws IOException, InterruptedException {
return value;
}
public float getProgress()
throws IOException, InterruptedException {
return processed ? 1.0f : 0.0f;
}
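// the first call to nextKeyValue() reads the entire file into memory and sets it as the value; subsequent calls return false, so each file yields exactly one record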
public boolean nextKeyValue()
throws IOException, InterruptedException {
if (!processed) {
byte[] contents = new byte[(int) fileSplit.getLength()];
Path file = fileSplit.getPath();
FileSystem fs = file.getFileSystem(conf);
FSDataInputStream in = null;
try {
in = fs.open(file);
in.readFully(0, contents, 0, contents.length);
value.set(contents, 0, contents.length);
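// record the input file name in the configuration so that the mapper can recover the observation time from it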
conf.set("map.input.file",
conf.get("mapred.input.dir") + "/" + file.getName());
} finally {
if (in != null) {
in.close();
}
}
processed = true;
return true;
}
return false;
}
public void close() throws IOException {
// do nothing
}
}
// ^^ WholeFileRecordReader