About: Does the $\ell_1$-norm Learn a Sparse Graph under Laplacian Constrained Graphical Models?
An Entity of Type: schema:ScholarlyArticle, within Data Space: covidontheweb.inria.fr, associated with source document(s)
Type: Academic Article, research paper, schema:ScholarlyArticle
Attributes and Values
type: Academic Article; research paper; schema:ScholarlyArticle
isDefinedBy: Covid-on-the-Web dataset
title: Does the $\ell_1$-norm Learn a Sparse Graph under Laplacian Constrained Graphical Models?
Creator: Cardoso, M; Palomar, Daniel; Vinícius, José; Ying, Jiaxi
source: ArXiv
abstract: We consider the problem of learning a sparse graph under Laplacian-constrained Gaussian graphical models. This problem can be formulated as a penalized maximum likelihood estimation of the precision matrix under Laplacian structural constraints. As in the classical graphical lasso problem, recent works have made use of $\ell_1$-norm regularization with the goal of promoting sparsity in Laplacian-constrained precision matrix estimation. However, we find that the widely used $\ell_1$-norm is not effective in imposing a sparse solution in this problem. Through empirical evidence, we observe that the number of nonzero graph weights grows as the regularization parameter increases. From a theoretical perspective, we prove that a large regularization parameter will surprisingly lead to a fully connected graph. To address this issue, we propose a nonconvex estimation method by solving a sequence of weighted $\ell_1$-norm penalized sub-problems and prove that the statistical error of the proposed estimator matches the minimax lower bound. To solve each sub-problem, we develop a projected gradient descent algorithm that enjoys a linear convergence rate. Numerical experiments involving synthetic and real-world data sets from the recent COVID-19 pandemic and financial stock markets demonstrate the effectiveness of the proposed method. An open source $\mathsf{R}$ package containing the code for all the experiments is available at https://github.com/mirca/sparseGraph.
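To make the method described in the abstract concrete: under the Laplacian constraint the precision matrix is parameterized by nonnegative edge weights w, and the penalized maximum-likelihood problem takes the form "minimize over w >= 0: -log gdet(L(w)) + tr(S L(w)) + lambda * sum(w)", where S is the sample covariance, L(w) is the graph Laplacian built from w, and gdet is the generalized determinant; because the weights are nonnegative, the $\ell_1$ penalty reduces (up to a constant factor) to the plain sum of the weights, i.e. a linear term in w. The Python sketch below runs ordinary projected gradient descent on this objective. It is a minimal illustration under stated assumptions, not the authors' sparseGraph R package nor their weighted $\ell_1$-norm scheme: the function names (laplacian, fit_sparse_graph), the log det(L + 11^T/p) stand-in for the generalized determinant, and the step-size and penalty values are all added here for illustration.

# Minimal sketch (not the sparseGraph package): penalized MLE of a
# Laplacian-constrained precision matrix via projected gradient descent
# on the nonnegative edge weights, as described in the abstract.
import numpy as np

def laplacian(w, p):
    """Build the p x p graph Laplacian from the p*(p-1)/2 edge weights w."""
    L = np.zeros((p, p))
    iu = np.triu_indices(p, k=1)
    L[iu] = -w
    L = L + L.T
    np.fill_diagonal(L, -L.sum(axis=1))   # diagonal = weighted node degrees
    return L

def fit_sparse_graph(S, lam=0.1, eta=1e-3, n_iter=2000):
    """Projected gradient descent for
         minimize_{w >= 0}  -log gdet(L(w)) + tr(S L(w)) + lam * sum(w),
       using log det(L + 11^T/p) in place of the generalized determinant
       (the two coincide for connected graphs)."""
    p = S.shape[0]
    iu = np.triu_indices(p, k=1)
    J = np.ones((p, p)) / p
    w = np.ones(iu[0].size)                       # start from a fully connected graph
    # constant part of the gradient: d tr(S L)/dw_ij = S_ii + S_jj - 2 S_ij
    g_lin = S[iu[0], iu[0]] + S[iu[1], iu[1]] - 2.0 * S[iu]
    for _ in range(n_iter):
        # assumes the iterates keep the graph connected, so L + J stays invertible
        M = np.linalg.inv(laplacian(w, p) + J)
        g_log = -(M[iu[0], iu[0]] + M[iu[1], iu[1]] - 2.0 * M[iu])
        w = np.maximum(w - eta * (g_log + g_lin + lam), 0.0)  # project onto w >= 0
    return w, laplacian(w, p)

# Toy usage (hypothetical data): S = np.cov(X, rowvar=False) for standardized
# n x p data X, then w, L = fit_sparse_graph(S, lam=0.2).

The projection is a simple clipping at zero because, once the precision matrix is parameterized by edge weights, the Laplacian constraint set is just the nonnegative orthant; the weighted $\ell_1$-norm sub-problems the abstract refers to would replace the single lam term with per-edge weights updated between sub-problems.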
has issue date: 2020-06-26 (xsd:dateTime)
has license: arxiv
sha1sum (hex): 9642b2b5a46b8a627ce410e86fbb1f60d08156a0
resource representing a document's title: Does the $\ell_1$-norm Learn a Sparse Graph under Laplacian Constrained Graphical Models?
resource representing a document's body: covid:9642b2b5a46b8a627ce410e86fbb1f60d08156a0#body_text
is schema:about of: named entity 'regularization'; named entity 'estimator'; named entity 'theoretical perspective'; named entity 'Laplacian'; named entity 'linear convergence'; (more)