IEEE Access: Practical Research, Open Solutions, vol. 9, 2021, pp. 48544-48554
Ahn, Pyunghwan (Korea Advanced Institute of Science and Technology, School of Electrical Engineering, Daejeon, South Korea) , Hong, Hyeong Gwon (Korea Advanced Institute of Science and Technology, Graduate School of AI, Daejeon, South Korea) , Kim, Junmo (Korea Advanced Institute of Science and Technology, School of Electrical Engineering, Daejeon, South Korea)
Neural architecture search (NAS) is an automated method that searches for an optimal network architecture by optimizing the combination of edges and operations. For efficiency, recent differentiable architecture search methods adopt a one-shot network containing all the candidate operations in each edge…
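The one-shot idea the abstract describes can be sketched in a few lines: each edge holds every candidate operation, and a softmax over learnable architecture parameters mixes their outputs so the choice of operation stays differentiable. A minimal sketch in the spirit of DARTS (the scalar "operations" and all names here are illustrative assumptions, not the paper's implementation):

```python
import math

# Candidate operations on one edge of the one-shot network.
# Real NAS cells use convolutions, pooling, identity, etc.;
# scalar functions keep this sketch self-contained and runnable.
CANDIDATE_OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def softmax(alphas):
    """Turn architecture parameters into mixing weights that sum to 1."""
    m = max(alphas)
    exps = [math.exp(a - m) for a in alphas]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_edge(x, alphas):
    """One-shot edge: a weighted sum of every candidate operation's
    output, so the architecture choice is differentiable in alphas."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, CANDIDATE_OPS.values()))

def discretize(alphas):
    """After search, keep only the operation with the largest weight."""
    names = list(CANDIDATE_OPS)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]

if __name__ == "__main__":
    alphas = [0.1, 2.0, -1.0]          # learned parameters (illustrative)
    print(mixed_edge(3.0, alphas))     # continuous mixture during search
    print(discretize(alphas))          # -> "double": the edge kept at the end
```

During search, gradients flow through `mixed_edge` to update both the weights inside the operations and the `alphas`; `discretize` is the final step that collapses each edge to its strongest operation.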
Published in an open-access journal.