
Modeling a periodic snapshot in data vault?


One of our data sources sends a feed with an aggregate of data per day, i.e. a periodic snapshot. For example:



shop,        day,        sales
bobs socks,  2019-01-01, 45
bobs socks,  2019-01-02, 50
bobs socks,  2019-01-03, 10
janes coats, 2019-01-01, 500
janes coats, 2019-01-02, 55
janes coats, 2019-01-03, 100


I know of two ways to model this in a Data Vault raw vault:



Multi-Active Satellite



Here we allow each satellite to have multiple rows per hub key.



create table dbo.HubShop (
    ShopName nvarchar(50) not null,
    constraint pk_HubShop primary key (ShopName)
);

create table dbo.SatDailyShopSales (
    ShopName nvarchar(50) not null,
    SalesDate date not null,
    SalesAmount money not null,
    LoadTimestamp datetime2(7) not null,
    constraint pk_SatDailyShopSales primary key (ShopName, SalesDate, LoadTimestamp)
);


This is easy to implement, but the satellite now has a bi-temporal element: SalesDate is the business timeline while LoadTimestamp is the load timeline.
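For example, a point-in-time read then has to reconcile both timelines. A sketch against the DDL above (@AsOf is just an illustrative variable):

-- Latest loaded version of each (shop, business date) pair as of @AsOf.
declare @AsOf datetime2(7) = '2019-01-04 00:00:00';

select ShopName, SalesDate, SalesAmount
from (
    select *,
           row_number() over (
               partition by ShopName, SalesDate
               order by LoadTimestamp desc) as rn
    from dbo.SatDailyShopSales
    where LoadTimestamp <= @AsOf
) as v
where rn = 1;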



Snapshot Hub



create table dbo.HubShop (
    ShopName nvarchar(50) not null,
    constraint pk_HubShop primary key (ShopName)
);

create table dbo.HubSnapshot (
    SalesDate date not null,
    constraint pk_HubSnapshot primary key (SalesDate)
);

create table dbo.LinkDailyShopSnapshot (
    LinkDailyShopSnapshotHash binary(32) not null,
    ShopName nvarchar(50) not null,
    SalesDate date not null,
    constraint pk_LinkDailyShopSnapshot primary key (LinkDailyShopSnapshotHash)
);

create table dbo.SatDailyShopSales (
    LinkDailyShopSnapshotHash binary(32) not null,
    SalesAmount money not null,
    LoadTimestamp datetime2(7) not null,
    constraint pk_SatDailyShopSales primary key (LinkDailyShopSnapshotHash, LoadTimestamp)
);


This second solution adds an extra hub which just stores a list of dates and a link for the intersection between date and shop.
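The binary(32) key suggests a SHA-256 hash of the two business keys. A sketch of how the link might be loaded (the delimiter, the upper-casing rule, and the staging table dbo.StageDailyShopSales are illustrative assumptions, not part of the model):

-- Derive the link hash key from the business keys (SHA-256 = 32 bytes).
insert into dbo.LinkDailyShopSnapshot
    (LinkDailyShopSnapshotHash, ShopName, SalesDate)
select hashbytes('SHA2_256',
           concat(upper(src.ShopName), '||', convert(char(10), src.SalesDate, 23))),
       src.ShopName,
       src.SalesDate
from dbo.StageDailyShopSales as src  -- hypothetical staging table
left join dbo.LinkDailyShopSnapshot as l
  on l.ShopName = src.ShopName and l.SalesDate = src.SalesDate
where l.ShopName is null;  -- insert only new shop/date intersections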



The second solution feels cleaner but requires more joins.
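To illustrate, reading sales per shop and day in the second model touches four tables instead of two (a sketch):

-- Hub -> link -> hub -> satellite: four tables versus two in the first model.
select h.ShopName, d.SalesDate, s.SalesAmount, s.LoadTimestamp
from dbo.HubShop as h
join dbo.LinkDailyShopSnapshot as l
  on l.ShopName = h.ShopName
join dbo.HubSnapshot as d
  on d.SalesDate = l.SalesDate
join dbo.SatDailyShopSales as s
  on s.LinkDailyShopSnapshotHash = l.LinkDailyShopSnapshotHash;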



Which is the correct model? Are there any better solutions?










data-vault

asked Mar 25 at 19:50 by James
1 Answer
































As far as my understanding of the Data Vault modelling approach goes, the satellites are there to store the accurate time slices of your data warehouse.
This means that, given a specific date, if I select all hubs and links (with no end date, or an end date <= that specific date), and then for each of them the satellite entry with max(LoadTimestamp) where LoadTimestamp <= that specific date, I should have a full representation of the real-world data state as of that date.
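A minimal sketch of that as-of read against your second model (assuming the DDL from your question; the @AsOf variable is illustrative):

declare @AsOf datetime2(7) = '2019-01-04 00:00:00';

-- For each shop/date intersection, pick the newest satellite row
-- loaded on or before @AsOf.
select l.ShopName, l.SalesDate, s.SalesAmount
from dbo.LinkDailyShopSnapshot as l
cross apply (
    select top (1) sat.SalesAmount
    from dbo.SatDailyShopSales as sat
    where sat.LinkDailyShopSnapshotHash = l.LinkDailyShopSnapshotHash
      and sat.LoadTimestamp <= @AsOf
    order by sat.LoadTimestamp desc
) as s;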



Applied to your question, this means that your second solution fits these requirements: you can still import "changes" from the source system as new time slices, thereby modelling the correct timeline of information in the DWH.



To formulate it as an example, let's say your source system has the state:



shop,        day,        sales
bobs socks,  2019-01-01, 45
bobs socks,  2019-01-02, 50
bobs socks,  2019-01-03, 10
janes coats, 2019-01-01, 500
janes coats, 2019-01-02, 55
janes coats, 2019-01-03, 100


and you import this data on 2019-01-03 23:30:00.
On January 4th at 12:10:00, though, the "janes coats" sales team corrects the 2019-01-03 number to only 90 sales.
In your first solution this leaves you updating the satellite entry with hub key "janes coats" and SalesDate 2019-01-03 to 90, effectively losing your accurate DWH history.



So your DWH only stores the following afterwards:



shop,        day,        sales
bobs socks,  2019-01-01, 45
bobs socks,  2019-01-02, 50
bobs socks,  2019-01-03, 10
janes coats, 2019-01-01, 500
janes coats, 2019-01-02, 55
janes coats, 2019-01-03, 90


Whereas in your second solution you simply insert a new satellite time slice for that shop-snapshot hash (business key "janes coats" with date 2019-01-03) with load date 2019-01-04 12:10:00 and sales 90:



          LINK
          shop, day, ID (think of ID as a hash)
          bobs socks, 2019-01-01, 1
          bobs socks, 2019-01-02, 2
          bobs socks, 2019-01-03, 3
          janes coats,2019-01-01, 4
          janes coats,2019-01-02, 5
          janes coats,2019-01-03, 6

          SALES Satellite
          Link ID, loaddate, sales
          1, 2019-01-03 23:30:00, 45
          2, 2019-01-03 23:30:00, 50
          3, 2019-01-03 23:30:00, 10
          4, 2019-01-03 23:30:00, 500
          5, 2019-01-03 23:30:00, 55
          6, 2019-01-03 23:30:00, 100 !
          6, 2019-01-04 12:10:00, 90 !


So you can easily see in your system that the sales numbers were corrected at 2019-01-04 12:10:00 and that they were 100 before that.
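For instance, to inspect the full history of one shop/day intersection (a sketch against your DDL; the expected output in the comments is taken from the tables above):

select s.LoadTimestamp, s.SalesAmount
from dbo.LinkDailyShopSnapshot as l
join dbo.SatDailyShopSales as s
  on s.LinkDailyShopSnapshotHash = l.LinkDailyShopSnapshotHash
where l.ShopName = N'janes coats'
  and l.SalesDate = '2019-01-03'
order by s.LoadTimestamp;
-- 2019-01-03 23:30:00   100
-- 2019-01-04 12:10:00    90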



The way I think of it, the only allowed update action in the Data Vault model is setting an EndDate in a link table, and deletes are never allowed. Then you have a full DWH history available and reproducible.
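As a sketch, that end-dating update could look like this, assuming the link carried an EndDate column (your DDL above does not have one, so the column is hypothetical here):

-- Hypothetical EndDate column on the link: close a relationship
-- without deleting any history.
update dbo.LinkDailyShopSnapshot
set EndDate = sysutcdatetime()
where ShopName = N'janes coats'
  and SalesDate = '2019-01-03'
  and EndDate is null;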






answered Apr 1 at 12:02 by FlorianB