Best way to transfer data from on-prem to AWS


I have a requirement to transfer data (one time) from on-premises to AWS S3. The data size is around 1 TB. I was looking into AWS DataSync, Snowball, etc., but those managed services seem better suited to migrations where the data runs to petabytes. Can someone suggest the best way to transfer the data securely and cost-effectively?
amazon-web-services amazon-s3 migration






asked Mar 23 at 9:48 by Bharani (edited Mar 23 at 10:03)
  • Where does the data need to end up? S3? A database? (Based on the tags, I guess S3.) What timeframe do you have for the transfer?

    – jcuypers, Mar 23 at 9:52

  • I need to store it in S3. Timing is not a constraint.

    – Bharani, Mar 23 at 9:53
3 Answers
Score: 1

If you have no specific requirements (apart from the fact that the data needs to be encrypted and the total size is 1 TB), I would suggest you stick to something plain and simple. S3 supports an object size of up to 5 TB, so you won't run into trouble there. I don't know whether your data is made up of many smaller files or one big file (or zip), but in essence it's all the same. Since the endpoints are all encrypted, you should be fine; if you're worried, you can encrypt your files beforehand, and they will then also be stored encrypted (useful if it's a backup of something). To get to the point: you can use API tools for the transfer, or file-explorer-style tools that also have S3 connectivity (e.g. https://www.cloudberrylab.com/explorer/amazon-s3.aspx). One other point: the cost-effectiveness of storage and transfer depends on how frequently you need the data. If it's just a backup kept just in case, archiving to Glacier is much cheaper.

– jcuypers, answered Mar 23 at 10:05
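As a concrete sketch of the two encryption options mentioned above (the bucket name, local path, and the GPG step are my own illustrations, not something this answer prescribes):

    # Server-side encryption (SSE-S3): S3 stores the objects encrypted at rest.
    aws s3 sync /data/export s3://my-backup-bucket/export --sse AES256

    # Client-side alternative: encrypt locally first and upload only the
    # ciphertext, so S3 never sees the plaintext.
    tar czf - /data/export | gpg --symmetric --cipher-algo AES256 -o export.tar.gz.gpg
    aws s3 cp export.tar.gz.gpg s3://my-backup-bucket/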
Score: 1

You can use the AWS Command-Line Interface (CLI). This command will copy data to Amazon S3:

    aws s3 sync c:/MyDir s3://my-bucket/

If there is a network failure or timeout, simply run the command again. It only copies files that are not already present in the destination.

The time taken will depend upon the speed of your Internet connection.

You could also consider using AWS Snowball, which is a piece of hardware that is sent to your location. It can hold 50 TB of data and costs $200.

– John Rotenstein, answered Mar 23 at 10:47
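For a 1 TB one-time sync it can also help to tune the CLI's S3 transfer settings before running the command above; the values below are illustrative assumptions, not recommendations from this answer:

    # Allow more parallel part uploads and larger multipart chunks
    # (both are standard AWS CLI "s3" configuration keys).
    aws configure set default.s3.max_concurrent_requests 20
    aws configure set default.s3.multipart_chunksize 64MB

    # Then run (and, after any interruption, simply re-run) the sync:
    aws s3 sync c:/MyDir s3://my-bucket/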
Score: 0

1 TB is large, but it's not so large that it'll take you weeks to get your data onto S3. However, if you don't have a good upload speed, use Snowball.

https://aws.amazon.com/snowball/

Snowball is a device shipped to you which can hold up to 100 TB. You load your data onto it and ship it back to AWS, and they upload it to the S3 bucket you specify when loading the data.

– Jack Marchetti, answered Mar 23 at 19:19
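A back-of-the-envelope check of the "weeks" claim, assuming (hypothetically) a sustained 100 Mbit/s uplink:

    # 1 TB ≈ 8,000,000 Mbit; 8,000,000 Mbit / 100 Mbit/s = 80,000 s ≈ 22 h.
    echo $(( 8 * 1000 * 1000 / 100 / 3600 )) hours    # prints "22 hours"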