Merged
Changes from all commits (99 commits)
105f9b6
Added version of pr-check for running on della
davidt0x Sep 3, 2021
2d25298
Added example notebook tests
davidt0x Jan 24, 2022
0e0b5cd
Disable notebook building from docs for now.
davidt0x Jan 24, 2022
2f70417
Fix some issues with notebook tests
davidt0x Feb 7, 2022
c63caab
increase job time on della tests
davidt0x Feb 7, 2022
015ce02
Add log message for running notebook tests
davidt0x Feb 7, 2022
5b92e42
more fixes
davidt0x Feb 7, 2022
e0a745b
Fixes some bugs on della's new head node
davidt0x Feb 21, 2022
b283f0a
fix some style errors
davidt0x Feb 21, 2022
e5d68ea
Added type: ignore for pkg_resources
davidt0x Feb 21, 2022
fa864b2
Get rid of install of types-pkg-resources
davidt0x Mar 7, 2022
bc78098
Remove numpy pin
davidt0x Mar 7, 2022
e179634
replace some references to np.bool with bool
davidt0x Mar 7, 2022
014e0b4
fix type error in test_image.py:129
davidt0x Mar 7, 2022
c4a5494
Fix syntax error in pr-check.sh
davidt0x Mar 7, 2022
0cb0adc
workaround for newer numpy and theano
davidt0x Mar 21, 2022
d30b4e5
pin statsmodel<=0.12
davidt0x Mar 21, 2022
9e06236
Convert comparison to np.allclose
davidt0x Mar 21, 2022
e6fceaf
fix the statsmodel issue with deprecated ARMA
davidt0x Mar 21, 2022
fadc914
fixed deprectaion warning for ndimage
davidt0x Mar 21, 2022
12921d6
swap np.matlib.repmat for np.tile becuase of deprecation
davidt0x Mar 21, 2022
fb5647b
fix ref to moved example iem_example_synthetic_RF_data
davidt0x Mar 21, 2022
0930055
add python 3.9 and 3.10 testing, add testbook req
davidt0x Mar 21, 2022
83a170a
fix python version specifiers to string in ci
davidt0x Mar 21, 2022
9a6e1f4
Attempted fixes for conda build and doc generation
davidt0x Apr 4, 2022
f9a992b
Convert embedded image in notebook to non-embedded.
davidt0x Apr 4, 2022
2eb5ff8
More warnings fixed on notebooks.
davidt0x Apr 4, 2022
98a5464
small change to test CI on master.
davidt0x Apr 4, 2022
52719e7
Update main.yml
davidt0x Apr 4, 2022
fe8788b
Some packaging changes
davidt0x Jul 25, 2022
cf30127
Add version cap for pymanopt, new version breaks things.
davidt0x Jul 25, 2022
243efdc
Remove typing-extensions and dataclasses dependencies. Not needed.
davidt0x Jul 25, 2022
709035e
Fix some numpy type decprecation warnings.
davidt0x Jul 25, 2022
53078a0
Fix some code style errors.
davidt0x Aug 8, 2022
d623422
More code style fixes
davidt0x Aug 8, 2022
5cbcd64
More codestyle fixes.
davidt0x Aug 8, 2022
85cec10
More code style fixes in tests
davidt0x Aug 8, 2022
41f9a2c
More style fixes in tests
davidt0x Aug 8, 2022
6d734dc
Fix exclusion of pymanopt in conda build
davidt0x Aug 8, 2022
372fff0
Specify engilish language for docs
davidt0x Aug 8, 2022
d3be6e4
Change 'jupyter_execute_notebooks' to 'nb_execution_mode'.
davidt0x Aug 8, 2022
fe8755e
Fix Myst-NB config deprecation
mihaic Aug 8, 2022
230d61e
Another Myst-NB config deprecation fix
mihaic Aug 9, 2022
7c20622
Suppress Myst-NB Holoviews warning
mihaic Aug 9, 2022
02ff296
Fix Myst warnings
mihaic Aug 10, 2022
d7e9e8a
Add setup requires dependencies to meta.yaml host
davidt0x Sep 19, 2022
259f607
Merge branch 'notebook_testing' of https://github.com/davidt0x/braini…
davidt0x Sep 19, 2022
a242774
Add use_scm_version=True to setup.cfg
davidt0x Sep 19, 2022
177bd52
Fix syntax for setting mypy github repo as dep
davidt0x Sep 19, 2022
bcc2e87
Pin pymanopt install in .conda/buiild.sh like in install_requires
davidt0x Sep 19, 2022
c795813
fix pymanopt pin
davidt0x Sep 19, 2022
aa97190
Remove mypy workaround
mihaic Sep 26, 2022
16f067b
Mark rtcloud and MPI notebooks as skip
davidt0x Oct 17, 2022
6ceed5f
Merge branch 'notebook_testing' of https://github.com/davidt0x/braini…
davidt0x Oct 17, 2022
219880d
Disable matrix-normal notebook test.
davidt0x Oct 31, 2022
ace6f6a
Remove code for checking "std=c++11" flags
davidt0x Oct 31, 2022
725f799
Remove extra parentheses.
davidt0x Oct 31, 2022
46223d3
Fix indent error
davidt0x Oct 31, 2022
ce10f21
Fix style
davidt0x Oct 31, 2022
dc621ad
Fix clang paths
davidt0x Oct 31, 2022
cfc8dfb
Fix clang path to be version 14
davidt0x Oct 31, 2022
227eeeb
Fix syntax error
davidt0x Oct 31, 2022
89e95d6
Fix syntax error and black format
davidt0x Nov 1, 2022
74ed4f3
Run black with 80 for line length limit.
davidt0x Nov 1, 2022
7c200d2
type: ignore to line in test_notebooks.py
davidt0x Nov 1, 2022
3a29dba
Fix some mypy issues.
davidt0x Nov 1, 2022
7c26a22
Add new workflow for running notebook tests
davidt0x Mar 6, 2023
32828e6
Dont setup python on della, use anaconda module.
davidt0x Mar 6, 2023
35f0f3c
Fix reference to llvm@14
davidt0x Mar 20, 2023
f6a638a
Fix deprecated call to get_data()
davidt0x Mar 20, 2023
60efa07
Rename workflow step.
davidt0x Mar 20, 2023
e50a432
Some changes to pr-check.sh for notebook tests
davidt0x Mar 20, 2023
212ddb3
Merge branch 'master' of https://github.com/davidt0x/brainiak into no…
davidt0x May 15, 2023
b151709
Convert calls from get_data() to get_fdatat()
davidt0x May 15, 2023
b7d4f54
Disable SRM notebook test for now.
davidt0x May 16, 2023
9a66139
Add dtype=float for Nifti1Pair image creation.
davidt0x May 16, 2023
7798442
Another get_data() to get_fdata() fix.
davidt0x May 16, 2023
254f6fc
Disable doc build on della
davidt0x May 16, 2023
408c4cb
Set IGNORE_CONDA false on della
davidt0x May 16, 2023
07a061f
Delete IGNOR_CONDA
davidt0x May 16, 2023
aa55f15
Change license_file to license_files
davidt0x May 16, 2023
5d3a349
Fix to find clang on macos
davidt0x May 16, 2023
a7135ae
Add concurrency constraints to workflows
davidt0x May 16, 2023
c749662
Codestyle fix
davidt0x May 16, 2023
1cffdc9
Install OpenMP for macos.
davidt0x May 16, 2023
cab7f57
Add include and lib dirs for omp
davidt0x May 16, 2023
834c41d
Bunch of mypy fixes
davidt0x May 17, 2023
ca755ed
Remove install of omp on MacOS.
davidt0x May 17, 2023
0ec46dd
Added a some documentation
davidt0x Aug 7, 2023
2f4d080
Add new errors to flake8 ignore
davidt0x Aug 7, 2023
fdd7c17
Add flake8 ignores for tests as well.
davidt0x Aug 7, 2023
af97896
Pin tensorflow to fix problems with tf probability
davidt0x Aug 7, 2023
b05a7d7
Move TMPDIR to scratch on della.
davidt0x Aug 7, 2023
a2b7f55
Attempt to fix doc build error
davidt0x Aug 7, 2023
1667879
Disable -W on Sphinx
davidt0x Aug 7, 2023
fb5744e
Disable conda build on Mac for now, testing.
davidt0x Sep 5, 2023
576050f
Fixes for building conda packages.
vineetbansal Oct 2, 2023
4d40347
Pinning numpy<=1.23.1 seems to resolve SRM issues.
vineetbansal Oct 3, 2023
e51f057
Fix the numpy pin
vineetbansal Oct 3, 2023
2 changes: 1 addition & 1 deletion .conda/build.sh
@@ -3,7 +3,7 @@
# Install from PyPI because there is no current conda package for the
# following. Explicitly install dependencies with no conda package as well
# because otherwise conda-build does not include them in the output package.
PIP_NO_INDEX=False $PYTHON -m pip install pymanopt
PIP_NO_INDEX=False $PYTHON -m pip install "pymanopt<=0.2.5"

# NOTE: This is the recommended way to install packages
$PYTHON setup.py install --single-version-externally-managed --record=record.txt
23 changes: 14 additions & 9 deletions .conda/meta.yaml
@@ -1,5 +1,5 @@
{% set conda_package_nonexistent = (
"pymanopt",
"pymanopt<=0.2.5",
) %}
{% set data = load_setup_py_data() %}

@@ -34,19 +34,24 @@ requirements:

host:
- python
- setuptools
- pip
- mpich
- openmp
{% for req in data.get('setup_requires', [])
if req not in conda_package_nonexistent -%}
- {{req}}
{% endfor %}
- llvm-openmp
- setuptools>=42
- wheel
- pybind11>=2.9.0
- scipy!=1.0.0
- cython
- numpy<=1.23.1
- setuptools_scm

run:
- python
- numpy<=1.23.1
- mpich
- openmp
- llvm-openmp
- tensorflow
- tensorflow-probability
{% for req in data.get('install_requires', [])
if req not in conda_package_nonexistent -%}
- {{req}}
@@ -55,7 +60,6 @@ requirements:
test:
commands:
- find $BRAINIAK_HOME/tests | grep pycache | xargs rm -rf
- pip install tensorflow tensorflow-probability
- mpiexec -n 2 pytest $BRAINIAK_HOME

# Known issue: https://github.com/travis-ci/travis-ci/issues/4704#issuecomment-348435959
@@ -64,4 +68,5 @@ test:
- conda inspect objects -p $PREFIX brainiak # [osx]
requires:
- pytest
- testbook
- numdifftools
20 changes: 20 additions & 0 deletions .github/workflows/della_notebooks.yml
@@ -0,0 +1,20 @@
# Workflow file for test runs on Princeton's della cluster. This only runs on self-
# hosted runners. The pr-check script detects that it is running on della and
# runs tests of notebooks that require larger resources than GitHub Actions hosted
# runners provide. It would probably be better to move a lot of that logic to this
# file in the future.

on: [pull_request]

concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true

jobs:
notebook_tests:
runs-on: self-hosted
steps:
- uses: actions/checkout@v2
- run: |
chmod a+x pr-check.sh
./pr-check.sh
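The workflow above just checks out the repository and hands off to pr-check.sh, which is expected to detect that it is running on the della cluster and enable the notebook tests that need more resources than GitHub-hosted runners provide. A hypothetical sketch of that kind of host detection, in Python for illustration only (the hostname prefix and helper name are assumptions, not the actual pr-check.sh logic):

```python
# Hypothetical sketch: gate the resource-heavy notebook tests on the runner's
# hostname. The "della" prefix and this helper are assumptions for illustration.
import socket


def running_on_della() -> bool:
    return socket.gethostname().startswith("della")


if __name__ == "__main__":
    if running_on_della():
        print("della detected: enabling large notebook tests")
    else:
        print("hosted runner detected: running the standard checks only")
```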
28 changes: 18 additions & 10 deletions .github/workflows/main.yml
@@ -1,4 +1,9 @@
on: [pull_request]
on: [pull_request, push]

concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true

jobs:
pypi:
env:
@@ -7,7 +12,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-20.04, macos-latest]
python-version: [3.8]
python-version: ["3.8", "3.9", "3.10"]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
@@ -21,11 +26,12 @@ jobs:
./pr-check.sh
- if: ${{ contains(matrix.os, 'macos') }}
run: |
export CC=$(brew --prefix)/opt/llvm/bin/clang
export CXX=$(brew --prefix)/opt/llvm/bin/clang++
export LDFLAGS="-L/usr/local/opt/llvm/lib
-Wl,-rpath,/usr/local/opt/llvm/lib $LDFLAGS"
export CPPFLAGS="-I/usr/local/opt/llvm/include $CPPFLAGS"
export CLANG_PREFIX=$(brew --prefix llvm@15)
export CC=$CLANG_PREFIX/bin/clang
export CXX=$CLANG_PREFIX/bin/clang++
export LDFLAGS="-L$CLANG_PREFIX/lib
-Wl,-rpath,$CLANG_PREFIX/lib $LDFLAGS -L/usr/local/opt/libomp/lib"
export CPPFLAGS="-I$CLANG_PREFIX/include $CPPFLAGS -I/usr/local/opt/libomp/include"
./pr-check.sh
- uses: codecov/codecov-action@v1
conda:
@@ -36,10 +42,12 @@ jobs:
python-version: [3.8]
steps:
- uses: actions/checkout@v2
- uses: conda-incubator/setup-miniconda@v2
- uses: mamba-org/setup-micromamba@v1
with:
channels: conda-forge
python-version: ${{ matrix.python-version }}
environment-name: test-env
create-args: >-
python=${{ matrix.python-version }}
boa
- run: |
export CONDA_HOME=$CONDA
conda install conda-build
6 changes: 6 additions & 0 deletions .gitignore
@@ -38,3 +38,9 @@ __MACOSX
brainiak/fcma/cython_blas.c
brainiak/eventseg/_utils.c
examples/fcma/face_scene/


docs/examples/fmrisim/Corr_MVPA_archive.tar.gz
docs/examples/iem/RademakerEtAl2019_WM_S05_avgTime.npz
docs/examples/isc/brainiak-aperture-isc-data.tgz
docs/examples/srm/brainiak-aperture-srm-data.tgz
4 changes: 2 additions & 2 deletions brainiak/factoranalysis/htfa.py
@@ -498,7 +498,7 @@ def _init_prior_posterior(self, rank, R, n_local_subj):

if rank == 0:
idx = np.random.choice(n_local_subj, 1)
self.global_prior_, self.global_centers_cov,\
self.global_prior_, self.global_centers_cov, \
self.global_widths_var = self.get_template(R[idx[0]])
self.global_centers_cov_scaled =\
self.global_centers_cov / float(self.n_subj)
@@ -724,7 +724,7 @@ def _fit_htfa(self, data, R):
m = 0
outer_converged = np.array([0])
while m < self.max_global_iter and not outer_converged[0]:
if(self.verbose):
if (self.verbose):
logger.info("HTFA global iter %d " % (m))
# root broadcast first 4 fields of global_prior to all nodes
self.comm.Bcast(self.global_prior_, root=0)
2 changes: 1 addition & 1 deletion brainiak/fcma/preprocessing.py
@@ -370,7 +370,7 @@ def prepare_searchlight_mvpa_data(images, conditions, data_type=np.float32,

logger.info('start to apply masks and separate epochs')
for sid, f in enumerate(images):
data = f.get_data().astype(data_type)
data = f.get_fdata().astype(data_type)
[d1, d2, d3, d4] = data.shape
if random == RandomType.REPRODUCIBLE:
data = data.reshape((d1 * d2 * d3, d4))
4 changes: 2 additions & 2 deletions brainiak/funcalign/fastsrm.py
@@ -238,7 +238,7 @@ def _check_imgs_array(imgs):
for i in range(n_subjects):
for j in range(n_sessions):
if not (isinstance(imgs[i, j], str) or isinstance(
imgs[i, j], np.str_) or isinstance(imgs[i, j], np.str)):
imgs[i, j], np.str_) or isinstance(imgs[i, j], str)):
raise ValueError("imgs[%i, %i] is stored using "
"type %s which is not a str" %
(i, j, type(imgs[i, j])))
@@ -337,7 +337,7 @@ def check_atlas(atlas, n_components=None):
return None

if not (isinstance(atlas, np.ndarray) or isinstance(atlas, str)
or isinstance(atlas, np.str_) or isinstance(atlas, np.str)):
or isinstance(atlas, np.str_) or isinstance(atlas, str)):
raise ValueError("Atlas is stored using "
"type %s which is neither np.ndarray or str" %
type(atlas))
4 changes: 2 additions & 2 deletions brainiak/funcalign/rsrm.py
@@ -191,8 +191,8 @@ def transform(self, X):
return r, s

def _transform_new_data(self, X, subject):
"""Transform new data for a subjects by projecting to the shared subspace and
computing the individual information.
"""Transform new data for a subjects by projecting to the shared
subspace and computing the individual information.

Parameters
----------
13 changes: 8 additions & 5 deletions brainiak/funcalign/srm.py
@@ -51,7 +51,8 @@


def _init_w_transforms(data, features, random_states, comm=MPI.COMM_SELF):
"""Initialize the mappings (Wi) for the SRM with random orthogonal matrices.
"""Initialize the mappings (Wi) for the SRM with random orthogonal
matrices.

Parameters
----------
@@ -240,8 +241,8 @@ def fit(self, X, y=None):
"Not all ranks have same number of subjects")

# Collect size information
shape0 = np.zeros((number_subjects,), dtype=np.int)
shape1 = np.zeros((number_subjects,), dtype=np.int)
shape0 = np.zeros((number_subjects,), dtype=int)
shape1 = np.zeros((number_subjects,), dtype=int)

for subject in range(number_subjects):
if X[subject] is not None:
@@ -480,7 +481,8 @@ def save(self, file):
)

def _srm(self, data):
"""Expectation-Maximization algorithm for fitting the probabilistic SRM.
"""Expectation-Maximization algorithm for fitting the probabilistic
SRM.

Parameters
----------
@@ -854,7 +856,8 @@ def transform_subject(self, X):
return w

def _srm(self, data):
"""Expectation-Maximization algorithm for fitting the probabilistic SRM.
"""Expectation-Maximization algorithm for fitting the probabilistic
SRM.

Parameters
----------
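The dtype edits above track NumPy's removal of the deprecated np.int and np.bool aliases (deprecated in NumPy 1.20, removed in 1.24), a recurring theme in this PR. A minimal before/after sketch for illustration:

```python
import numpy as np

# Replacements for the removed aliases: the Python builtins, or explicit
# fixed-width types such as np.int64 / np.bool_ where width matters.
shape0 = np.zeros((3,), dtype=int)       # previously dtype=np.int
mask = np.array([0, 1, 1]).astype(bool)  # previously .astype(np.bool)
print(shape0.dtype, mask.dtype)          # e.g. int64, bool
```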
10 changes: 10 additions & 0 deletions brainiak/funcalign/sssrm.py
@@ -32,6 +32,16 @@
from sklearn.utils import assert_all_finite
from sklearn.utils.validation import NotFittedError
from sklearn.utils.multiclass import unique_labels

# Workaround for Theano for numpy after 1.20.3, see:
# https://github.com/numpy/numpy/issues/21079
try:
import numpy.distutils
blas_info = np.__config__.blas_ilp64_opt_info # type: ignore
numpy.distutils.__config__.blas_opt_info = blas_info # type: ignore
except Exception:
pass

import theano
import theano.tensor as T
import theano.compile.sharedvalue as S
27 changes: 16 additions & 11 deletions brainiak/image.py
@@ -24,7 +24,7 @@

import itertools

from typing import Iterable, Sequence, Type, TypeVar
from typing import Optional, Iterable, Sequence, Type, TypeVar

import numpy as np

@@ -104,8 +104,11 @@ def extract_labels(self) -> np.ndarray:
return condition_idxs[unique_epoch_idxs]


def mask_image(image: SpatialImage, mask: np.ndarray, data_type: type = None
) -> np.ndarray:
def mask_image(
image: SpatialImage,
mask: np.ndarray,
data_type: Optional[type] = None
) -> np.ndarray:
"""Mask image after optionally casting its type.

Parameters
@@ -127,19 +130,21 @@ def mask_image(image: SpatialImage, mask: np.ndarray, data_type: type = None
ValueError
Image data and masks have different shapes.
"""
image_data = image.get_data()
image_data = image.get_fdata()
if image_data.shape[:3] != mask.shape:
raise ValueError("Image data and mask have different shapes.")
if data_type is not None:
cast_data = image_data.astype(data_type)
cast_data: np.ndarray = image_data.astype(data_type)
else:
cast_data = image_data
return cast_data[mask]


def multimask_images(images: Iterable[SpatialImage],
masks: Sequence[np.ndarray], image_type: type = None
) -> Iterable[Sequence[np.ndarray]]:
def multimask_images(
images: Iterable[SpatialImage],
masks: Sequence[np.ndarray],
image_type: Optional[type] = None
) -> Iterable[Sequence[np.ndarray]]:
"""Mask images with multiple masks.

Parameters
@@ -161,7 +166,7 @@


def mask_images(images: Iterable[SpatialImage], mask: np.ndarray,
image_type: type = None) -> Iterable[np.ndarray]:
image_type: Optional[type] = None) -> Iterable[np.ndarray]:
"""Mask images.

Parameters
@@ -178,5 +183,5 @@
np.ndarray
Masked image.
"""
for images in multimask_images(images, (mask,), image_type):
yield images[0]
for imgs in multimask_images(images, (mask,), image_type): # type: ignore
yield imgs[0] # type: ignore
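The image.py changes swap the deprecated get_data() call for get_fdata() and make the optional typing parameters explicit (Optional[type]) for mypy; the masking behaviour itself is unchanged. A minimal usage sketch of mask_image with a toy image (the array shapes and values are illustrative assumptions):

```python
import numpy as np
import nibabel as nib
from brainiak.image import mask_image

# Toy 2x2x2 volume and a boolean mask selecting two voxels.
data = np.arange(8, dtype=np.float64).reshape(2, 2, 2)
img = nib.Nifti1Image(data, affine=np.eye(4))
mask = np.zeros((2, 2, 2), dtype=bool)
mask[0, 0, 0] = mask[1, 1, 1] = True

masked = mask_image(img, mask, data_type=np.float32)  # data_type may be omitted
print(masked.shape, masked.dtype)  # (2,), float32
```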
13 changes: 7 additions & 6 deletions brainiak/io.py
@@ -22,7 +22,7 @@
]

from pathlib import Path
from typing import Callable, Iterable, List, Union
from typing import Optional, Callable, Iterable, List, Union

import logging
import nibabel as nib
@@ -69,7 +69,7 @@ def load_images_from_dir(in_dir: Union[str, Path], suffix: str = "nii.gz",
logger.debug(
'Starting to read file %s', f
)
yield nib.load(str(f))
yield nib.load(str(f)) # type: ignore


def load_images(image_paths: Iterable[Union[str, Path]]
@@ -100,11 +100,12 @@ def load_images(image_paths: Iterable[Union[str, Path]]
logger.debug(
'Starting to read file %s', string_path
)
yield nib.load(string_path)
yield nib.load(string_path) # type: ignore


def load_boolean_mask(path: Union[str, Path],
predicate: Callable[[np.ndarray], np.ndarray] = None
predicate: Optional[
Callable[[np.ndarray], np.ndarray]] = None
) -> np.ndarray:
"""Load boolean nibabel.SpatialImage mask.

@@ -123,11 +124,11 @@ def load_boolean_mask(path: Union[str, Path],
"""
if not isinstance(path, str):
path = str(path)
data = nib.load(path).get_data()
data = nib.load(path).get_fdata() # type: ignore
if predicate is not None:
mask = predicate(data)
else:
mask = data.astype(np.bool)
mask = data.astype(bool)
return mask


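Similarly, load_boolean_mask now reads voxel data with get_fdata() and casts with the builtin bool. A short usage sketch (the file names and the threshold predicate are assumptions for illustration):

```python
from brainiak.io import load_boolean_mask

# Default behaviour: nonzero voxels become True.
mask = load_boolean_mask("mask.nii.gz")

# Custom predicate: threshold a probabilistic mask at 0.5.
tight_mask = load_boolean_mask("prob_mask.nii.gz", predicate=lambda d: d > 0.5)
print(mask.dtype, tight_mask.dtype)  # bool, bool
```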
14 changes: 7 additions & 7 deletions brainiak/reprsimil/brsa.py
@@ -4080,13 +4080,13 @@ def _check_scan_onsets_GBRSA(self, scan_onsets, X):
return scan_onsets

def _bin_exp(self, n_bin, scale=1.0):
""" Calculate the bin locations to approximate exponential distribution.
It breaks the cumulative probability of exponential distribution
into n_bin equal bins, each covering 1 / n_bin probability. Then it
calculates the center of mass in each bins and returns the
centers of mass. So, it approximates the exponential distribution
with n_bin of Delta function weighted by 1 / n_bin, at the
locations of these centers of mass.
""" Calculate the bin locations to approximate exponential
distribution. It breaks the cumulative probability of
exponential distribution into n_bin equal bins, each covering
1 / n_bin probability. Then it calculates the center of mass in
each bins and returns the centers of mass. So, it approximates the
exponential distribution with n_bin of Delta function weighted by
1 / n_bin, at the locations of these centers of mass.
Parameters:
-----------
n_bin: int
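The reflowed _bin_exp docstring describes approximating an exponential distribution by n_bin point masses, each with weight 1/n_bin, placed at the centers of mass of equal-probability bins. A standalone sketch of that construction (not the library's private implementation; scipy.stats is used here only for the quantiles):

```python
import numpy as np
from scipy import stats


def exp_bin_centers(n_bin, scale=1.0):
    """Centers of mass of n_bin equal-probability bins of Exp(scale)."""
    # Edges that split the CDF into n_bin equal parts; the last edge is +inf.
    edges = stats.expon.ppf(np.linspace(0.0, 1.0, n_bin + 1), scale=scale)
    lower, upper = edges[:-1], edges[1:]

    def partial_tail_mean(x):
        # For Exp(scale): integral of t * pdf(t) over [x, inf) = (x + scale) * exp(-x / scale)
        out = np.zeros_like(x)
        finite = np.isfinite(x)
        out[finite] = (x[finite] + scale) * np.exp(-x[finite] / scale)
        return out

    # Integral of t * pdf(t) over each bin, divided by the bin probability 1 / n_bin.
    return (partial_tail_mean(lower) - partial_tail_mean(upper)) * n_bin


print(exp_bin_centers(4))  # four delta locations, each carrying weight 1/4
```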