Updated Chapter 8 of *The LaTeX Companion*: Advanced Mathematics

"latexcomp-ch8.pdf 是《LaTeX Companion》第八章的更新版,针对AMS-LaTeX 1.2版本进行了修订,修正了其中过时的部分,但整体结构和内容保持不变,不是整个书籍的全面修订。此文档专注于高等数学的排版,介绍了如何在LaTeX中更高效地处理复杂的数学公式和构造。" 在LaTeX中,基本功能已经提供了强大的数学公式排版能力。然而,当需要频繁输入复杂的方程式或其他数学结构时,用户需要定义新的命令或环境来简化输入工作。AMS-LaTeX是LaTeX的一个扩展,特别增强了对数学公式的支持,尤其适合处理高级数学表达。 AMS-LaTeX 1.2版本的更新可能包括了新的宏包、命令的改进或者对原有功能的优化,使得用户能够更加灵活和精确地表示数学概念。例如,它可能增加了对多行公式的支持,改进了矩阵和数组的排版,或者优化了符号的间距和对齐方式,以提高阅读和理解的清晰度。 在《LaTeX Companion》的第八章中,读者可以期待学习到以下内容: 1. **定义新命令**:如何自定义命令来快速输入常见的数学符号或短语,减少重复劳动。 2. **数学环境**:如`equation`、`align`、`alignat`等环境的用法,用于排列单行或多行的数学公式,保持良好的对齐和编号。 3. **上下标与分数**:正确使用上标和下标,以及如何输入分数和根号。 4. **大括号和数组**:创建大括号(用于集合或矩阵)以及各种类型的数组和表格。 5. **定理和证明**:定义和格式化定理、引理、推论、证明等结构。 6. **符号与运算**:了解并使用各种数学符号,如积分、微分、极限、箭头等。 7. **宏包的利用**:如`amsmath`、`amsfonts`和`amssymb`等宏包,它们提供了额外的数学符号和格式选项。 8. **特殊结构**:处理积分、级数、微分方程等复杂结构的方法。 9. **参考文献和索引**:在数学文档中如何组织引用和创建索引,确保资料的准确性和完整性。 通过学习这份更新的章节,用户将能够更有效地利用LaTeX来处理高级数学的排版需求,提高论文或报告的专业性和可读性。同时,了解这些高级特性也有助于编写出更加整洁和易于维护的LaTeX源代码。
## deepHAR

Code repository for experiments on deep architectures for HAR in ubicomp. Using this code you will be able to replicate some of the experiments described in our IJCAI 2016 paper:

```
@article{hammerla2016deep,
  title={Deep, convolutional, and recurrent models for human activity recognition using wearables},
  author={Hammerla, Nils Y and Halloran, Shane and Ploetz, Thomas},
  journal={IJCAI 2016},
  year={2016}
}
```

## Disclaimer

This code is still incomplete. At the moment only the bi-directional RNN will work on the Opportunity data-set.

## Installation

```
git clone https://github.com/torch/distro.git ~/torch --recursive
cd ~/torch; bash install-deps; ./install.sh

# after installation, we need some additional packages

# HDF5 luarock
sudo apt-get install libhdf5-serial-dev hdf5-tools
git clone https://github.com/deepmind/torch-hdf5
cd torch-hdf5
luarocks make hdf5-0-0.rockspec LIBHDF5_LIBDIR="/usr/lib/x86_64-linux-gnu/"

# json
luarocks install json

# RNN support
luarocks install torch
luarocks install nn
luarocks install dpnn
luarocks install torchx
luarocks install rnn

# we use python3
pip3 install h5py
pip3 install simplejson
pip3 install numpy
```

## Usage

First download and extract the Opportunity dataset. Then use the provided python script in the `data` directory to prepare the training/validation/test sets.

```
cd data
python3 data_reader.py opportunity /path/to/OpportunityUCIDataset
```

This will generate two hdf5-files that are read by the lua scripts, `opportunity.h5` and `opportunity.h5.classes.json`. To train the bi-directional RNN that we have found to work best on this set, run the following commands:

```
cd models/RNN
th main_brnn.lua -data ../../data/opportunity.h5 -cpu \
  -layerSize 179 -maxInNorm 2.283772707 \
  -learningRate 0.02516758 -sequenceLength 81 \
  -carryOverProb 0.915735543 -numLayers 1 \
  -logdir EXP_brnn
```

This will train a model using only your CPUs, which will take a while (make sure you have some form of BLAS library installed).
On my laptop this takes approx. 5 min per epoch, and it will likely not converge before epoch 60. If your environment is set up for GPU-based computation, use `-gpu 1` instead of the `-cpu` flag for a significant speedup.

## Other models

The python-based `data_reader.py` is new and substitutes for the original but unmaintainable Matlab scripts used previously. So far it only supports `opportunity` and sample-based evaluation; both limitations will be addressed shortly.