
Sliding stops updates #231

Open
carolineychen8 wants to merge 31 commits into main from caroline-sliding-notebook

Conversation

@carolineychen8 (Contributor) commented Feb 12, 2026

Some code changes are still to be made, but I created this PR to track commits.

@andresmondragont (Contributor) left a comment


Same test error as in #216, plus a new test error:
FAILED nomad/tests/test_stop_detection.py::test_sequential_per_user_basic - pandas.errors.InvalidIndexError: Reindexing only valid with uniquely valued Index objects

___________________________________________________ test_sequential_per_user_basic ____________________________________________________

base_df =                    uid   timestamp  tz_offset  longitude   latitude             local_datetime             x          ...   3600  38.320389 -36.667515  2024-01-15 13:39:00+01:00  4.265806e+06 -4.392863e+06  kc7r2hb

[26977 rows x 9 columns]

    def test_sequential_per_user_basic(base_df):
        """Test detect_stops_per_user on multi-user data."""
        traj_cols = {
            "user_id": "uid", "timestamp": "timestamp",
            "x": "x", "y": "y"
        }
        df = loader.from_df(base_df, traj_cols=traj_cols, parse_dates=True, mixed_timezone_behavior="utc")
    
>       stops_df = SEQUENTIAL.detect_stops_per_user(
            data=df,
            delta_roam=100,
            dt_max=10,
            dur_min=5,
            method='sliding',
            traj_cols=traj_cols,
            complete_output=False,
            n_jobs=1
        )

nomad/tests/test_stop_detection.py:402: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
nomad/stop_detection/sequential.py:394: in detect_stops_per_user
    return pd.concat(results, ignore_index=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.13/site-packages/pandas/core/reshape/concat.py:466: in concat
    result = _get_result(
.venv/lib/python3.13/site-packages/pandas/core/reshape/concat.py:653: in _get_result
    indexers[ax] = obj_labels.get_indexer(new_labels)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = Index(['x', 'y', 'timestamp', 'duration', 'uid', 'uid'], dtype='str')
target = Index(['cluster', 'x', 'y', 'timestamp', 'duration', 'uid'], dtype='str'), method = None, limit = None, tolerance = None

    @final
    def get_indexer(
        self,
        target,
        method: ReindexMethod | None = None,
        limit: int | None = None,
        tolerance=None,
    ) -> npt.NDArray[np.intp]:
        """
        Compute indexer and mask for new index given the current index.
    
        The indexer should be then used as an input to ndarray.take to align the
        current data to the new index.
    
        Parameters
        ----------
        target : Index
            An iterable containing the values to be used for computing indexer.
        method : {None, 'pad'/'ffill', 'backfill'/'bfill', 'nearest'}, optional
            * default: exact matches only.
            * pad / ffill: find the PREVIOUS index value if no exact match.
            * backfill / bfill: use NEXT index value if no exact match
            * nearest: use the NEAREST index value if no exact match. Tied
              distances are broken by preferring the larger index value.
        limit : int, optional
            Maximum number of consecutive labels in ``target`` to match for
            inexact matches.
        tolerance : optional
            Maximum distance between original and new labels for inexact
            matches. The values of the index at the matching locations must
            satisfy the equation ``abs(index[indexer] - target) <= tolerance``.
    
            Tolerance may be a scalar value, which applies the same tolerance
            to all values, or list-like, which applies variable tolerance per
            element. List-like includes list, tuple, array, Series, and must be
            the same size as the index and its dtype must exactly match the
            index's type.
    
        Returns
        -------
        np.ndarray[np.intp]
            Integers from 0 to n - 1 indicating that the index at these
            positions matches the corresponding target values. Missing values
            in the target are marked by -1.
    
        See Also
        --------
        Index.get_indexer_for : Returns an indexer even when non-unique.
        Index.get_non_unique : Returns indexer and masks for new index given
            the current index.
    
        Notes
        -----
        Returns -1 for unmatched values, for further explanation see the
        example below.
    
        Examples
        --------
        >>> index = pd.Index(["c", "a", "b"])
        >>> index.get_indexer(["a", "b", "x"])
        array([ 1,  2, -1])
    
        Notice that the return value is an array of locations in ``index``
        and ``x`` is marked by -1, as it is not in ``index``.
        """
        method = clean_reindex_fill_method(method)
        orig_target = target
        target = self._maybe_cast_listlike_indexer(target)
    
        self._check_indexing_method(method, limit, tolerance)
    
        if not self._index_as_unique:
>           raise InvalidIndexError(self._requires_unique_msg)
E           pandas.errors.InvalidIndexError: Reindexing only valid with uniquely valued Index objects

.venv/lib/python3.13/site-packages/pandas/core/indexes/base.py:3728: InvalidIndexError
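The traceback shows the trigger: one of the per-user result frames handed to `pd.concat` carries a duplicated `uid` column (`self = Index(['x', 'y', 'timestamp', 'duration', 'uid', 'uid'], ...)`), and reindexing non-unique columns onto the union of labels raises `InvalidIndexError`. A minimal sketch of the failure mode and a defensive dedupe, using made-up frames unrelated to the actual test fixture:

```python
import pandas as pd

# Hypothetical per-user result frames: `b` ends up with a duplicated 'uid'
# column (e.g. if 'uid' survives both as index name and as a column).
a = pd.DataFrame({"x": [1.0], "y": [2.0], "uid": ["u1"]})
b = pd.DataFrame([[3.0, 4.0, "u2", "u2"]], columns=["x", "y", "uid", "uid"])

# Because the column sets differ, concat must reindex b's columns onto the
# union of labels; a non-unique Index cannot be reindexed, so this raises.
try:
    pd.concat([a, b], ignore_index=True)
except pd.errors.InvalidIndexError as exc:
    print("concat failed:", exc)

# Defensive fix: drop duplicate column labels (keeping the first) before
# concatenating the per-user results.
b_clean = b.loc[:, ~b.columns.duplicated()]
merged = pd.concat([a, b_clean], ignore_index=True)
print(merged.columns.tolist())  # ['x', 'y', 'uid']
```

The real fix likely belongs upstream of `detect_stops_per_user`'s `pd.concat(results, ignore_index=True)` call, wherever the per-user frame gains the second `uid` column.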

@andresmondragont andresmondragont added bug Something isn't working and removed bug Something isn't working labels Mar 1, 2026