
Comparing changes

Choose two branches to see what’s changed or to start a new pull request. If you need to, you can also compare across forks or learn more about diff comparisons.

base repository: googleapis/nodejs-datastore
base: v8.7.0
head repository: googleapis/nodejs-datastore
compare: v9.0.0
  • 5 commits
  • 18 files changed
  • 4 contributors

Commits on Apr 2, 2024

  1. test: sort results, check for sorted order (#1245)

    * sort results, check for sorted order
    
    * test: sort results, check for sorted order
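    As an illustration of the kind of sorted-order check these commit messages describe, a test can sort a copy of the returned results and compare it against the raw result order (the Task kind and priority property here are placeholders, not the test’s real fixtures):

    ```ts
    import * as assert from 'assert';
    import {it} from 'mocha';
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore();

    it('returns results in sorted order', async () => {
      const query = datastore.createQuery('Task').order('priority');
      const [entities] = await datastore.runQuery(query);

      // Sort a copy of the results, then check the server returned them sorted.
      const priorities = entities.map(entity => entity.priority as number);
      const sorted = [...priorities].sort((a, b) => a - b);
      assert.deepStrictEqual(priorities, sorted);
    });
    ```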
    danieljbruce authored Apr 2, 2024 · 10f85fd

Commits on Apr 4, 2024

  1. fix: read time should be used for transaction reads (#1171)

    * Allow the Datastore projectId to be fetched from the client
    
    Latency is caused by the call to getProjectId from Google auth. This change allows the project ID to be retrieved if it is set in the client at creation time, thereby reducing call latency.
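    A minimal sketch of the configuration this describes: pass the project ID when constructing the client so the library does not have to ask google-auth-library for it on the first request (the project ID below is a placeholder):

    ```ts
    import {Datastore} from '@google-cloud/datastore';

    // With projectId supplied up front, the first request can skip the async
    // getProjectId() lookup that otherwise adds latency.
    const datastore = new Datastore({projectId: 'my-project'});
    ```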
    
    * Create a file for mocking out commits
    
    A test file is created where we mock out commit in the Gapic layer. The mock allows us to get the results passed to the commit endpoint in the Gapic layer.
    
    * Create a test to measure the latency of a call
    
    To prove that the change reduces latency, a test is written. The test checks that the time between the initial call in the user’s code and the moment the call reaches the Gapic layer is sufficiently small. That window is very short if the program does not need to do an auth lookup.
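    A hedged sketch of that latency check. Wiring a fake Gapic client in through clients_ mirrors a pattern used in this repo’s unit tests, but it is an internal detail that may differ between versions, and the 100 ms threshold is illustrative only:

    ```ts
    import * as assert from 'assert';
    import {it} from 'mocha';
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore({projectId: 'my-project'});

    it('reaches the Gapic layer without an auth round trip', async () => {
      let reachedGapicAt = 0;

      // Fake data client: record when commit reaches the Gapic layer and
      // return an empty but well-formed commit response.
      (datastore as any).clients_.set('DatastoreClient', {
        commit: (request: {}, gaxOptions: {}, callback: Function) => {
          reachedGapicAt = Date.now();
          callback(null, {mutationResults: [], indexUpdates: 0});
        },
      });

      const calledAt = Date.now();
      await datastore.save({
        key: datastore.key(['Task', 'sample']),
        data: {done: false},
      });

      // With no getProjectId() lookup, the gap should be very small.
      assert.ok(reachedGapicAt - calledAt < 100);
    });
    ```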
    
    * Run the linting fixes
    
    Run the linter so that spacing in the PR gets fixed for some of the lines of code.
    
    * Add license header to top of test file
    
    The license header needs to be added to the top of the new test file that is used for mocking out commit.
    
    * Add comment for test for now
    
    This is going to be a test for investigating the latency of the client.
    
    * Add a test for the mock server
    
    Measure the latency between the original call and the mock server.
    
    * Set current retry attempt to 0
    
    * Add a log for call time
    
    Do the check outside the function after the async call. Add a log for the call time.
    
    * Add another mock
    
    The other mock doesn’t require lazy client initialization.
    
    * Eliminate code from the mock file
    
    Eliminate the fake datastore client because we need to do assertion checks that are specific to each test. This means there is no point in defining runQuery once in a mock because each test will mock it out differently.
    
    * Start off by adding read time to read options
    
    Add the code change that will add read time to read options for transactions.
    
    # Conflicts:
    #	test/transaction.ts
    
    * Update the test to use transactions
    
    The idea is to test that read time got passed along for transactions specifically. This will be necessary for snapshot reads to work.
    
    * Remove only
    
    Need the entire test suite to run
    
    * Remove the before hook
    
    The before hook is not necessary. Just mock out the data client at the start.
    
    * Remove unnecessary cherry picked files
    
    Files were cherry-picked that weren’t helpful for solving the problem. Remove them.
    
    * Clean up PR diff
    
    * clean up PR diff
    
    * Update the test so that it is run as a transaction
    
    Right now, providing a transaction id is necessary to run the request as a transaction.
    
    * Add an integration test
    
    The integration test looks at the data returned for the snapshot read time in a transaction and ensures that the read returns no data, thereby exercising the read time parameter.
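    A sketch of the snapshot-read behaviour the integration test exercises. The readTime option name and its epoch-milliseconds unit are assumptions inferred from these commit messages, not a quotation of the library’s API:

    ```ts
    import * as assert from 'assert';
    import {it} from 'mocha';
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore();

    it('sees no data when reading at a time before the write', async () => {
      const beforeWrite = Date.now();
      await new Promise(resolve => setTimeout(resolve, 2000)); // let time pass
      await datastore.save({
        key: datastore.key(['Task', 'written-later']),
        data: {done: true},
      });

      const transaction = datastore.transaction({readOnly: true});
      await transaction.run();
      const query = transaction.createQuery('Task');
      // The fix threads this read time into the transaction's read options,
      // so the query observes the database as of beforeWrite.
      const [entities] = await transaction.runQuery(query, {readTime: beforeWrite});
      await transaction.commit();

      assert.strictEqual(entities.length, 0);
    });
    ```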
    
    * Linting: fix indents
    
    Fix the indentation in the system test folder.
    
    * Update the header
    
    * Fix unit test
    
    beginTransaction needs to be mocked out now that a transaction will begin if runQuery is called.
    
    * System test changes.
    
    Add a sleep. Instead of changing the current test, add a new test so the reader of the PR can be sure that test coverage wasn’t reduced.
    
    * Modify test
    
    Modify the test so that sleeps are long enough to create predictable results and tests actually check for the right values.
    
    * Replace with less precise assert
    
    The test setup sometimes prepares the before-data with 0 entries and sometimes with 1 entry, so a less restrictive assertion is required for the test to pass consistently.
    danieljbruce authored Apr 4, 2024 · 73a0a39

Commits on May 8, 2024

  1. fix!: An existing method `UpdateVehicleLocation` is removed from service `VehicleService` (#1248)
    
    * fix!: An existing method `UpdateVehicleLocation` is removed from service `VehicleService`
    fix!: An existing method `SearchFuzzedVehicles` is removed from service `VehicleService`
    fix!: An existing message `UpdateVehicleLocationRequest` is removed
    
    PiperOrigin-RevId: 631557549
    
    Source-Link: googleapis/googleapis@3d50414
    
    Source-Link: googleapis/googleapis-gen@5ce63d4
    Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNWNlNjNkNGU2MzZhOTc1MTc1YmRlMmQxNmMxNWU3MGRkNWE4MWZmNCJ9
    
    * 🦉 Updates from OwlBot post-processor
    
    See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md
    
    ---------
    
    Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
    gcf-owl-bot[bot] authored May 8, 2024 · ba79118

Commits on May 9, 2024

  1. feat: Query profiling feature (#1221)

    * Add the right files from the preview branch
    
    Copy paste over the files from the preview branch
    
    * Thread Query mode through the library
    
    Thread the query mode through the client library. Allow calls to be made using the query mode.
    
    * Create a basic workflow for testing profiling grpc
    
    This code allows basic query profiling to work and creates a basic code snippet that can be used to make it work.
    
    * This test makes a call to the mock server
    
    This will be useful for debugging grpc issues.
    
    * Now the test works with the mock server in repo
    
    This mock server will be super useful for ensuring that the right data gets transmitted in private preview so that we can confirm the client library is working correctly.
    
    * Delete js file
    
    We use the ts file instead.
    
    * Try explain
    
    See if Explain gets passed along to the mock server.
    
    * Update protos.json
    
    The updated protos.json is needed to pass the plan along.
    
    * Try an aggregate query
    
    * Add test to catch breaking change to return type
    
    Make sure changes don’t break the use of this callback by adding this test.
    
    * Try redefining RunQueryCallback
    
    Define it with a more general data structure.
    
    * Revert Gapic changes to match head branch
    
    The merge caused generated files to have a diff. The diff should not be in the PR.
    
    * Remove only
    
    only should not be used here
    
    * Add data structures for return values
    
    Data structures need to be added for the return values that contain stats.
    
    * Add plumbing to send stats info back
    
    Add stats to info if they are available. If there are no results, end the stream and send the info back. This way, stats will always be sent in info if they are available and the program won’t break if there are no results.
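    For illustration, this is roughly how the streaming path surfaces those stats to a caller. The info event and the explainOptions request option reflect where the PR eventually lands, and the exact names here are assumptions:

    ```ts
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore();

    datastore
      .runQueryStream(datastore.createQuery('Task'), {
        explainOptions: {analyze: true},
      })
      .on('data', entity => console.log(entity))
      .on('info', info => {
        // Stats ride along with the query info. If the result set is empty,
        // the stream still ends by emitting the info, so stats are not lost.
        console.log(info.explainMetrics);
      })
      .on('end', () => console.log('query complete'));
    ```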
    
    * Set test to only that we intend to explore
    
    * Add a comment about stats
    
    Explain what happens when the result set is empty. Just send the stats back.
    
    * Delete the mock server code
    
    The mock server code was used for debugging and now we don’t need it since the service is working.
    
    * Remove calls to nightly
    
    Calls to nightly don’t have a second database to work with. Regular calls work now so nightly calls are not necessary.
    
    * Introduce profiling tests again
    
    Bring the query profiling tests back.
    
    * Revert "Remove calls to nightly"
    
    This reverts commit 040d0a5.
    
    * Stats are optional
    
    Stats do not necessarily come from the server.
    
    * Write some tests to test each mode
    
    Each query profiling mode needs to be explored.
    
    * Add code for parsing the stats returned
    
    The stats returned need to be parsed by a special library that removes the complexities of the Struct object.
    
    * Add dependencies necessary for parsing
    
    A library is needed for parsing the Struct values. Add the libraries and use them.
    
    * Add assertions for the expected plan/stats
    
    Expected plan and expected stats should be used in the tests. This ensures the tests check for the proper stats values.
    
    * Refactor info with stats build and add info to cb
    
    Add a specific type for the runAggregationQuery callback so that it can now support the info parameter. In order to allow runAggregationQuery to make use of creating info, we also refactor info construction into a separate function.
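    A sketch of the aggregation path with that info parameter; the exact return shape is an assumption based on these commit messages:

    ```ts
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore();

    async function profileCount() {
      const query = datastore.createQuery('Task');
      const aggregate = datastore.createAggregationQuery(query).count('total');

      // The callback/promise now carries an info object alongside the
      // results, so explain metrics can be returned for aggregations too.
      const [results, info] = await datastore.runAggregationQuery(aggregate, {
        explainOptions: {analyze: true},
      });

      console.log(results); // e.g. [{total: 42}]
      console.log(info?.explainMetrics?.executionStats);
      return results;
    }
    ```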
    
    * Modify the parser for runAggregationQuery
    
    The parser for runAggregationQuery should have a deterministic process for computing results.
    
    * Add asserts for the return values of runAggregate
    
    Make sure that the entities and the plan are correct.
    
    * Complete tests for runQuery and aggregation query
    
    The assertion checks for runQuery and runAggregationQuery should be done and they test each mode.
    
    * Add tests for Query and AggregateQuery
    
    Tests should ensure that the run function works properly on both of these objects.
    
    * Add initial transaction tests for runQuery
    
    The runQuery tests have been added which get the right results. Next the right assert statements and info collection will be added to the tests.
    
    * Add checks on info to the transaction tests
    
    Checks against info are needed to be sure that stats are passed back to the caller properly.
    
    * Fix tests for aggregate queries in transactions
    
    Add tests for running aggregate queries inside transactions.
    
    * Ran linter, added all tests for runQueryStream
    
    Added a temporary test for runQueryStream. Also ran the linter.
    
    * Change parsing of return values
    
    Return values are going to look different for users. Change the code so that the parsing is done differently.
    
    * Reformat the info function
    
    This function is more readable if we eliminate some of the ternary expressions and the complex logic for building the info object.
    
    * Change tests as a result of structure changes
    
    The structure of the QueryInfo object is changed. Modify the tests to work with the new structure.
    
    * Use import and not require
    
    import is better for catching compile-time errors and is more customary.
    
    * Better spacing for imports
    
    Change the spacing so that the imports are all in one place.
    
    * Introduce a single function for checking the execution stats. Make sure all the tests use this function. Pull out the run query plan and the run aggregation query plan.
    
    * Fix the tests so that they call the right fns
    
    Add assertion checks to check the query plan against some expected value and make sure the right assertion checks are done for the right tests.
    
    * Finish the tests for the streaming call
    
    Finish the tests for specifying no mode, normal mode, EXPLAIN mode, and EXPLAIN_ANALYZE mode. Make sure the tests pass.
    
    * Delete code that will not be used anymore
    
    There is a lot of boilerplate code that was needed for the streaming call. Get rid of it here.
    
    * Make changes to match new proto
    
    The code was changed to use the new proto so that it compiles.
    
    * Add Explain Metrics interface
    
    Make slight change to withBeginTransaction so that code compiles under new structure. Also group plan and statistics under the new explainMetrics interface.
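    A hedged sketch of that grouping, with field names following the Datastore v1 ExplainMetrics proto; the library’s actual interfaces may be shaped slightly differently:

    ```ts
    // Plan-only data (returned whenever explain is requested).
    interface PlanSummary {
      indexesUsed?: Array<Record<string, unknown>>;
    }

    // Execution data (returned only when analyze is true).
    interface ExecutionStats {
      resultsReturned?: number;
      executionDuration?: {seconds?: number; nanos?: number};
      readOperations?: number;
      debugStats?: Record<string, unknown>;
    }

    // The new grouping: plan and statistics live under one explainMetrics field.
    interface ExplainMetrics {
      planSummary?: PlanSummary;
      executionStats?: ExecutionStats;
    }

    interface RunQueryInfo {
      explainMetrics?: ExplainMetrics;
    }
    ```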
    
    * Remove bytesReturned from test
    
    Proto should not be passing along bytesReturned anymore.
    
    * Fix system tests to use values matching new struct
    
    * Remove calls to nightly
    
    Feature is now fully ready so nightly tests should not be done because the feature is expected to work in production.
    
    * Query profiling
    
    Add test for runQuery. Send back plan summary and execution stats.
    
    * Add a test for runAggregationQuery
    
    runAggregationQuery needs a unit test to validate request/return data.
    
    * Parameterize the query profiling tests
    
    * run the linter
    
    * Export Query Mode in index.ts
    
    Query mode needs to be exported so that it can be accessed by the user.
    
    * Change data structure types to match return values
    
    * Remove TODO
    
    * remove import
    
    * delete the query profiling samples
    
    * Remove abstraction for RunQueryCallback
    
    * Change the comment to describe new data types
    
    * Remove TODO
    
    * linting fixes
    
    * Update type to include runAggregationQuery
    
    * Put else back in
    
    This change is actually simpler because it doesn’t introduce a let. It is also a much smaller diff.
    
    * mode is not needed in sharedQueryOptions
    
    * Revert "mode is not needed in sharedQueryOptions"
    
    This reverts commit b8d0c63.
    
    * Rearrange imports
    
    rearrange the imports to simplify the diff.
    
    * Revert imports to simplify diff
    
    * Don’t change Entities position
    
    Simplify diff
    
    * Move timestamp import back
    
    * This interface is only needed once
    
    Define the interface inline
    
    * Remove QueryMode and replace with explainOptions
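    A sketch of the resulting surface: instead of a query mode, the caller passes explainOptions and reads the metrics back from the query info. Field names are assumptions consistent with the commits above:

    ```ts
    import {Datastore} from '@google-cloud/datastore';

    const datastore = new Datastore();

    async function profileQuery() {
      const query = datastore.createQuery('Task').limit(10);

      // analyze: true runs the query and returns execution stats as well as
      // the plan; analyze: false (or omitting it) returns the plan only.
      const [entities, info] = await datastore.runQuery(query, {
        explainOptions: {analyze: true},
      });

      console.log(info.explainMetrics?.planSummary);
      console.log(info.explainMetrics?.executionStats);
      return entities;
    }
    ```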
    
    * A few system tests
    
    Add a few tests for the different explain options cases.
    
    * Add a few tests for the false case
    
    * Add more specific types to introduced function
    
    * mode parameter is no longer required
    
    * This signature change is no longer required
    
    * Update the comment for getInfoFromStats
    
    * GapicExplainOptions are no longer needed.
    
    * Set analyze to false to match description
    
    * Add a test for runQueryStream with analyze set to false
    
    * Add a test for analyze set to false
    
    * Import ExplainOptions
    
    * Remove bytesReturned from the interface
    
    * Make types in test more specific
    
    * name as string
    
    * Rely on 2 dependencies from google-gax instead
    
    * Change expectations in the test to reflect new values
    
    * Ran linter
    
    * Remove extraneous import
    
    * Modify stats return type
    danieljbruce authored May 9, 2024 · 414dec4

Commits on May 27, 2024

  1. chore(main): release 9.0.0 (#1247)

    Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
    release-please[bot] authored May 27, 2024 · 32329d2