Stars
Abu quantitative trading system (stocks, options, futures, Bitcoin, machine learning). An open-source, Python-based quantitative trading and investment framework.
Maximizing target coverage by adjusting the orientations of distributed sensors in directional sensor networks using reinforcement learning.
This repository is the official implementation of "Learning Multi-Agent Coordination for Enhancing Target Coverage in Directional Sensor Networks".
🇨🇳 GitHub rankings of Chinese-language projects, with separate "Software | Resources" charts for each language, to pinpoint good Chinese projects. Take what you need and learn efficiently.
This program uses attention and coverage mechanisms for handwritten mathematical expression recognition (HMER) and is based on PyTorch.
Semantic Graph Representation Learning for Handwritten Mathematical Expression Recognition (ICDAR 2023)
Official implementation for ECCV 2022 paper "CoMER: Modeling Coverage for Transformer-based Handwritten Mathematical Expression Recognition"
Syntax-Aware Network for Handwritten Mathematical Expression Recognition
When Counting Meets HMER: Counting-Aware Network for Handwritten Mathematical Expression Recognition (ECCV 2022 Poster).
Materials I compiled during my studies, including lab reports: assignment answers and lab reports for the Big Data program at the University of Electronic Science and Technology of China (UESTC).
Image stitching algorithm with optical flow, RANSAC, and a minimum-cost path; object detection with Faster R-CNN in PyTorch.
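For context, the RANSAC stage of such a stitching pipeline is a match-then-fit loop over local features. The sketch below is a generic, hypothetical OpenCV example, not this repository's code; the file names, the number of matches kept, and the reprojection threshold are made-up values.

```python
import cv2
import numpy as np

# Generic ORB + RANSAC homography sketch (illustrative only).
img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matching suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC fits the homography while rejecting outlier correspondences.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```

The returned inlier mask can then be used to keep only geometrically consistent matches before warping.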
The partial differential equation (PDE) of 2D heat transfer is solved with a Physics-Informed Neural Network (PINN), while time is discretized by Euler's method.
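For reference, the explicit (forward) Euler discretization mentioned here advances u_t = α(u_xx + u_yy) one step at a time. A minimal NumPy sketch of that step on a uniform grid, with illustrative values and independent of this repository's PINN code, might look like:

```python
import numpy as np

def euler_step(u, alpha, dt, dx):
    """One forward-Euler step of the 2D heat equation u_t = alpha*(u_xx + u_yy)."""
    lap = np.zeros_like(u)
    # Five-point Laplacian on interior nodes; boundaries stay fixed (Dirichlet).
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] -
                       4.0 * u[1:-1, 1:-1]) / dx**2
    return u + dt * alpha * lap

# Toy usage: a 64x64 plate with a hot patch in the middle (illustrative values).
u = np.zeros((64, 64))
u[28:36, 28:36] = 100.0
for _ in range(500):
    u = euler_step(u, alpha=1.0, dt=1e-4, dx=0.1)
```

Note the usual stability constraint dt ≤ dx²/(4α) for this explicit scheme.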
This is a repository containing the MATLAB code and the .mat files with the data samples referenced in my thesis.
A PyTorch implementation of UnsupervisedDeepImageStitching.
Course code for "Statistical Learning and Pattern Recognition" at UESTC (class of 2020).
My Python code for the new, restructured UFLDL course. The site is http://ufldl.stanford.edu/tutorial/
Stanford UFLDL tutorial exercises in Python.
Code for "LoFTR: Detector-Free Local Feature Matching with Transformers", CVPR 2021, T-PAMI 2022
Panoramic image stitching of multiple images with Python-OpenCV, eliminating ghosting and seams.
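As a point of comparison, OpenCV also ships a high-level stitching API that accepts any number of images and handles seam finding and blending internally. A minimal sketch, with placeholder input file names, could be:

```python
import cv2

# Input file names are placeholders; cv2.Stitcher handles registration,
# seam finding, and blending for any number of overlapping images.
images = [cv2.imread(p) for p in ("1.jpg", "2.jpg", "3.jpg")]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(images)
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)
else:
    print("Stitching failed with status", status)
```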
LaTeX formula notes for chapters 1–11, 13, and 20–23 of "An Introduction to Optimization".
CVPR 2022 (Oral) - Deep Rectangling for Image Stitching: A Learning Baseline
TIP 2021 - Unsupervised deep image stitching network
Image stitching that supports multiple images: it can use more than two pictures to make a panorama.