I am a “Ramón y Cajal” fellow at the Universitat de Barcelona. I develop and tailor solutions for social science research methods, drawing on current developments in Bayesian inference, data visualization, probabilistic programming, experimental design and machine learning. I have made substantial contributions to comparative politics, public administration, public policy, international relations and psychology.
I have worked in the fields of global governance and IGOs, the diffusion of policies and institutions, and the development of regulatory agencies. I also work on Internet and e-Government diffusion and other aspects of the public management of the Information Society, which lately includes the adoption of Artificial Intelligence in public administration.
PhD in Political Science
Universitat Pompeu Fabra
PGDip in Social Science Data Analysis
University of Essex
Courses and Seminars currently offered
Software development for the advancement of research
My research centers on the methodology of the social sciences, developing innovative quantitative frameworks for the analysis of public policy. I have designed data-gathering processes and programmed large databases that capture the complexity of the objects of study.
When analyzing data, my aim is always to develop new methods to extract as much knowledge as possible from the available information, while remaining rigorous about scientific validity and theoretical innovation. To achieve this I have created innovative designs for the data collection phase, developed better software to manage the data, and devised new techniques for its analysis.
I have created and maintain ggmcmc, an R package that provides tools for assessing and diagnosing convergence of Markov Chain Monte Carlo simulations, as well as for graphically displaying the results of full MCMC analyses.
I have published an article that develops a method to extract configurations of values (social, economic, pragmatic) from organizations, using a combination of experts and ignoramus (many individuals, in a “big data” sort of approach): Extracting configurations of values mixing scores from experts and ignoramus using Bayesian modelling.
I have created the PolicyPortfolios package for analyzing policy portfolios in the context of the work of the Consensus and Accupol projects at the Chair of Empirical Theory of Politics.
An article in Policy Sciences that introduces policy portfolios and their use was awarded the bureaucracy research prize of the Cologne Institute for Economic Research. The online appendix includes several videos showing the evolution of policy portfolios in the countries analyzed.
The data and the package have also been the foundation of the article “Studying Policy Design Quality in Comparative Perspective”, published in the American Political Science Review. doi: 10.1017/S0003055421000186.
I have created and maintain a Docker image that makes Bayesian inference with JAGS and ggmcmc easy in high-performance computing environments. This facilitates the analysis of large datasets and Big Data, as well as MCMC simulations that demand high computing power.
process_nici.zip is a Python script that reads Gas chromatography–mass spectrometry (GC-MS) files and organizes samples, compounds and concentrations in a way that is easy to process later with statistical software.
It is described in the document Processing the quantification of samples from a NICI.
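As a rough illustration of the kind of reshaping such a script performs (the column names, the PCB compound labels and the wide CSV layout below are assumptions for the example, not the actual NICI export format), a wide table with one row per sample and one column per compound can be turned into tidy sample/compound/concentration rows:

```python
import csv
import io

def tidy_gcms(raw: str) -> list[dict]:
    """Reshape a wide GC-MS export (one row per sample, one column per
    compound) into tidy rows of sample / compound / concentration."""
    reader = csv.DictReader(io.StringIO(raw))
    rows = []
    for record in reader:
        sample = record.pop("sample")
        for compound, concentration in record.items():
            rows.append({
                "sample": sample,
                "compound": compound,
                "concentration": float(concentration),
            })
    return rows

# Hypothetical wide export: two samples, two compounds
raw = "sample,PCB-28,PCB-52\nS01,0.12,0.34\nS02,0.56,0.78\n"
tidy = tidy_gcms(raw)
```

The long (tidy) format is what makes the output easy to feed into statistical software afterwards, since each observation sits in its own row.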
st_coeffs.pl is a Perl script that processes coefficients from logistic regression in SPSS and produces standardized coefficients (2004).
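The script itself is in Perl; as a sketch in Python of one common way to x-standardize logistic coefficients (multiplying each raw coefficient by the standard deviation of its predictor; whether this matches the script's exact convention is an assumption, and the toy data below are invented):

```python
import statistics

def standardize_coefficients(coefs: dict[str, float],
                             data: dict[str, list[float]]) -> dict[str, float]:
    """x-standardize logistic regression coefficients: multiply each raw
    coefficient by the sample standard deviation of its predictor, so that
    coefficients read as 'change in log-odds per SD of x'."""
    return {name: b * statistics.stdev(data[name]) for name, b in coefs.items()}

# Toy example: two predictors measured on very different scales
coefs = {"age": 0.05, "income": 0.0002}
data = {
    "age": [20, 30, 40, 50, 60],
    "income": [20000, 35000, 50000, 65000, 80000],
}
std = standardize_coefficients(coefs, data)
```

This makes coefficients comparable across predictors: income looks tiny in raw units but turns out to be the larger effect once expressed per standard deviation.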
My GitHub account shows the projects in which I am involved, including maintaining the JAGS ebuild for Gentoo.
I am a member of the Foundation for Open Access Statistics, which promotes free software, open access publishing and reproducible research in statistics.
I am a member of the Open Science Center at LMU Munich, an interdisciplinary center that promotes and fosters open science practices.