Monday, July 19, 2010

Massive Data

Source: http://mndoci.com/2010/06/30/massive-data/

Facebook
36 PB of uncompressed data
2250 machines
23,000 cores
32 GB of RAM per machine
processing 80-90TB/day
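The Facebook figures invite a quick back-of-envelope division. A minimal sketch, using only the numbers quoted above (and taking 85 TB as the midpoint of the 80-90 TB/day range):

```python
# Rough per-machine and per-core daily throughput from the Facebook
# figures quoted in the post. Purely illustrative arithmetic.
machines = 2250
cores = 23_000
daily_tb = 85  # midpoint of the quoted 80-90 TB/day

per_machine_gb = daily_tb * 1024 / machines  # GB processed per machine per day
per_core_gb = daily_tb * 1024 / cores        # GB processed per core per day

print(f"{per_machine_gb:.1f} GB/machine/day, {per_core_gb:.1f} GB/core/day")
# roughly 39 GB per machine and under 4 GB per core each day
```

Spread over the cluster, the headline number is modest per node, which is exactly the point of scale-out systems like Hadoop.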

Yahoo
70 PB of data in HDFS
170 PB spread across the globe
34000 servers
Processing 3 PB per day
120 TB flow through Hadoop every day

Twitter
7 TB/day into HDFS

LinkedIn
120 Billion relationships
82 Hadoop jobs daily (IIRC)
16 TB of intermediate data
2 engineers

WOW........
