First time accepted submitter MadC0der writes "We just signed a project with a very large company. We are a computer vision based company, and our project gathers images from a facility in PA. Our company is located in TN. The company we're gathering images from is on a very high speed fiber optic network. However, being a small company of 11 developers and 1 systems engineer, we're on a business class 100 Mbps cable connection, which works well for us but not in this situation. The information gathered from the client in PA is a 1.5 MB .bmp image along with a 3 MB depth map file, making each snapshot a little under 5 MB. This may sound small, but images are taken every 3-5 seconds, which can lead to a very large amount of data captured and transferred each day. Our facility is incapable of handling such large transfers without affecting internal network performance. We've come to the conclusion that a cloud service would be the best solution for our problem. We're now thinking the customer's workstation will sync the data with the cloud, and we can automate pulling the data during off hours so we won't encounter congestion during analysis. Can anyone suggest a stable, fairly priced cloud solution that will sync large amounts of data offsite for retrieval at our convenience (a nightly rsync script should handle this process)?"
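For a rough sense of scale, and of how long the nightly pull would take over the 100 Mbps link, here is a hypothetical sketch in Python. The host name and directory paths are made-up placeholders, and the rsync call assumes the chosen provider exposes the synced data over SSH; this illustrates the nightly-pull idea from the post, not any particular service's API.

#!/usr/bin/env python3
"""Sketch of the nightly off-hours pull described in the post.

Assumptions (not from the post): the cloud-synced snapshots are
reachable over SSH at cloud.example.com under /data/snapshots.
Adjust the host, paths, and flags to match whatever provider is chosen.
"""
import subprocess

# Back-of-envelope volume: ~5 MB per snapshot, one every 3-5 seconds.
SNAPSHOT_MB = 5
for interval_s in (3, 5):
    per_day_gb = SNAPSHOT_MB * (24 * 3600 / interval_s) / 1024
    hours_at_100mbps = per_day_gb * 1024 * 8 / 100 / 3600
    print(f"every {interval_s}s: ~{per_day_gb:.0f} GB/day, "
          f"~{hours_at_100mbps:.1f} h to pull at a full 100 Mbps")

# Nightly pull, intended to run from cron during off hours.
# --partial lets an interrupted transfer resume where it left off;
# --compress is worthwhile because .bmp images and raw depth maps are
# uncompressed formats that shrink well in transit.
subprocess.run([
    "rsync", "-av", "--partial", "--compress",
    "cloud.example.com:/data/snapshots/",
    "/srv/vision/incoming/",
], check=True)

On those numbers, a round-the-clock capture works out to roughly 85-140 GB per day, which fits an overnight window of two to three hours only if the pull can actually saturate the 100 Mbps line.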

Read more of this story at Slashdot.
