<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Axect&#39;s Blog</title>
    <link>https://axect.github.io/</link>
    <description>Recent content on Axect&#39;s Blog</description>
    <generator>Hugo</generator>
    <language>en</language>
    <copyright>&lt;a href=&#34;https://creativecommons.org/licenses/by-nc/4.0/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;CC BY-NC 4.0&lt;/a&gt;</copyright>
    <lastBuildDate>Mon, 09 Feb 2026 17:00:00 +0900</lastBuildDate>
    <atom:link href="https://axect.github.io/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Research</title>
      <link>https://axect.github.io/research/</link>
      <pubDate>Mon, 09 Feb 2026 17:00:00 +0900</pubDate>
      <guid>https://axect.github.io/research/</guid>
      <description>&lt;h2 id=&#34;research-overview&#34;&gt;Research Overview&lt;/h2&gt;&#xA;&lt;p&gt;My research sits at the intersection of &lt;strong&gt;dark matter phenomenology&lt;/strong&gt; and &lt;strong&gt;scientific machine learning&lt;/strong&gt;. One part of my work studies primordial black holes (PBHs), axion-like particles (ALPs), and the photon, positron, and neutrino signals they can generate. The other part develops machine learning models and computational frameworks that preserve physical structure while accelerating expensive calculations, solving inverse problems, and reconstructing dynamics from limited information.&lt;/p&gt;&#xA;&lt;p&gt;With training that began in astronomy and continued through a Ph.D. in theoretical particle physics, I have built a research program that moves naturally between astroparticle phenomenology and AI-driven modeling. Work at Yonsei HEP-COSMO established this foundation, and my current appointments at Fudan University and RIKEN iTHEMS are extending it toward precision phenomenology and physics-informed AI. I do not treat these as separate tracks: phenomenology motivates the computational questions, and machine learning becomes useful only when it respects the structure of the underlying physics.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Software &amp; Projects</title>
      <link>https://axect.github.io/software/</link>
      <pubDate>Mon, 09 Feb 2026 15:20:00 +0900</pubDate>
      <guid>https://axect.github.io/software/</guid>
      <description>&lt;p&gt;Open-source software and research projects by Tae-Geun Kim (Axect).&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;-active-projects&#34;&gt;🚀 Active Projects&lt;/h2&gt;&#xA;&lt;h3 id=&#34;peroxide&#34;&gt;Peroxide&lt;/h3&gt;&#xA;&lt;center&gt;&#xA;&lt;p style=&#34;text-align:center&#34;&gt;&lt;strong&gt;Rust numerical computing library&lt;/strong&gt;&#xA;&lt;img src=&#34;https://img.shields.io/badge/Rust-DEA584?style=flat-square&amp;amp;logo=rust&amp;amp;logoColor=black&#34; alt=&#34;Rust&#34;&gt; &lt;img src=&#34;https://img.shields.io/github/stars/Axect/Peroxide?style=flat-square&#34; alt=&#34;Stars&#34;&gt; &lt;img src=&#34;https://img.shields.io/crates/v/peroxide?style=flat-square&#34; alt=&#34;Crates.io&#34;&gt; &lt;img src=&#34;https://img.shields.io/crates/d/peroxide?style=flat-square&#34; alt=&#34;Downloads&#34;&gt;&lt;/p&gt;&#xA;&lt;/center&gt;&#xA;&lt;p&gt;A comprehensive numerical computing library for Rust, providing functionality comparable to Python&amp;rsquo;s NumPy/SciPy. It serves as core infrastructure for my scientific computing research.&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Key Features:&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Linear algebra with BLAS/LAPACK integration&lt;/li&gt;&#xA;&lt;li&gt;Optimization algorithms (Gradient Descent, Levenberg-Marquardt, etc.)&lt;/li&gt;&#xA;&lt;li&gt;Numerical integration &amp;amp; ODE/PDE solvers&lt;/li&gt;&#xA;&lt;li&gt;Statistical distributions &amp;amp; special functions&lt;/li&gt;&#xA;&lt;li&gt;DataFrame with multiple I/O formats (CSV, NetCDF, JSON, Parquet)&lt;/li&gt;&#xA;&lt;li&gt;FFI support for C, Fortran, and Python&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;&lt;strong&gt;Tech Stack:&lt;/strong&gt; SIMD, BLAS, LAPACK, proc-macro metaprogramming&lt;/p&gt;
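&#xA;&lt;p&gt;As a minimal taste of the API (a sketch in the spirit of the project README; exact signatures may differ between Peroxide versions):&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code class=&#34;language-rust&#34;&gt;use peroxide::fuga::*;&#xA;&#xA;fn main() {&#xA;    // Build a 2x2 matrix in row-major order with the c! vector macro&#xA;    let a = matrix(c!(1, 2, 3, 4), 2, 2, Row);&#xA;    a.print(); // pretty-print the matrix&#xA;    println!(&#34;det = {}&#34;, a.det()); // determinant of a&#xA;}&#xA;&lt;/code&gt;&lt;/pre&gt;</description>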
    </item>
    <item>
      <title>About</title>
      <link>https://axect.github.io/about/</link>
      <pubDate>Mon, 06 Nov 2023 19:03:00 +0900</pubDate>
      <guid>https://axect.github.io/about/</guid>
      <description>&lt;p&gt;About &lt;a href=&#34;https://github.com/Axect&#34;&gt;&lt;strong&gt;Tae Geun Kim (Axect)&lt;/strong&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;i-am&#34;&gt;I am&lt;/h2&gt;&#xA;&lt;p&gt;Researcher &amp;amp; Rustacean&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;experience&#34;&gt;Experience&lt;/h2&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Joint Postdoc: Institute of Modern Physics, Fudan University &amp;amp; RIKEN iTHEMS (2025.10 ~ )&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;education&#34;&gt;Education&lt;/h2&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Ph.D.: Department of Physics, Yonsei University (2017.03 ~ 2025.08)&lt;/li&gt;&#xA;&lt;li&gt;B.S.: Department of Astronomy, Yonsei University (2012.03 ~ 2017.02)&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;research-interests&#34;&gt;Research Interests&lt;/h2&gt;&#xA;&lt;p&gt;My research bridges &lt;strong&gt;dark matter phenomenology&lt;/strong&gt; and &lt;strong&gt;AI4Science&lt;/strong&gt;, exploring primordial black holes and axion-like particles, and developing operator learning frameworks for physics. For a detailed research vision and publications, please visit the &lt;a href=&#34;https://axect.github.io/research/&#34;&gt;Research&lt;/a&gt; page.&lt;/p&gt;</description>
    </item>
    <item>
      <title>📊 Piecewise Rejection Sampling</title>
      <link>https://axect.github.io/posts/006_prs/</link>
      <pubDate>Fri, 18 Nov 2022 17:49:04 +0900</pubDate>
      <guid>https://axect.github.io/posts/006_prs/</guid>
      <description>&lt;figure&gt;&#xA;    &lt;img src=&#34;https://axect.github.io/posts/images/006_01_test_dist.png&#34;&#xA;         alt=&#34;Differential energy spectrum of ALPs from a primordial black hole (PBH)${}^{[1]}$&#34;/&gt; &lt;figcaption style=&#34;text-align:center&#34;&gt;&#xA;            &lt;p&gt;Differential energy spectrum of ALPs from a primordial black hole (PBH)&lt;a href=&#34;https://axect.github.io/posts/006_prs/#footnotes&#34;&gt;${}^{[1]}$&lt;/a&gt;&lt;/p&gt;&#xA;        &lt;/figcaption&gt;&#xA;&lt;/figure&gt;&#xA;&lt;p&gt;Suppose you&amp;rsquo;re presented with an unnormalized probability density function graph such as the one shown. If you&amp;rsquo;re tasked with generating 10,000 data points that conform to this distribution, what would be your approach?&lt;/p&gt;&#xA;&lt;p&gt;Here are the two most commonly used methods to sample data from an arbitrary probability density function:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://en.wikipedia.org/wiki/Inverse_transform_sampling&#34;&gt;Inverse Transform Sampling&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://en.wikipedia.org/wiki/Rejection_sampling&#34;&gt;Rejection Sampling&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;p&gt;Inverse Transform Sampling computes the cumulative distribution function (CDF) of the target density, inverts it, and pushes uniform random numbers through the inverse CDF to produce samples. This can be very efficient, but when the exact form of the density is unknown, as in our case, the inverse CDF is hard to obtain&lt;a href=&#34;https://axect.github.io/posts/006_prs/#footnotes&#34;&gt;${}^{[2]}$&lt;/a&gt;. Rejection Sampling, on the other hand, works regardless of the density&amp;rsquo;s form, so we&amp;rsquo;ll begin there.&lt;/p&gt;
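&#xA;&lt;p&gt;To make this concrete, here is a minimal Rust sketch of plain (non-piecewise) rejection sampling under a constant envelope. The toy density and the &lt;code&gt;rand&lt;/code&gt; 0.8-style API are assumptions for illustration, not the post&amp;rsquo;s actual code:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code class=&#34;language-rust&#34;&gt;use rand::Rng; // assumed dependency: rand = &#34;0.8&#34;&#xA;&#xA;// Toy unnormalized density standing in for the PBH spectrum above&#xA;fn f(x: f64) -&amp;gt; f64 {&#xA;    (-(x - 0.3).powi(2) / 0.01).exp() + 0.5 * (-(x - 0.8).powi(2) / 0.02).exp()&#xA;}&#xA;&#xA;fn main() {&#xA;    let (a, b, m) = (0.0, 1.0, 1.1); // domain [a, b]; envelope m above max f&#xA;    let mut rng = rand::thread_rng();&#xA;    let mut samples = Vec::with_capacity(10_000);&#xA;    while samples.len() &amp;lt; 10_000 {&#xA;        let x = rng.gen_range(a..b); // uniform proposal on [a, b]&#xA;        let u = rng.gen_range(0.0..m); // uniform height under the envelope&#xA;        if u &amp;lt; f(x) {&#xA;            samples.push(x); // kept with probability f(x) / m&#xA;        }&#xA;    }&#xA;    println!(&#34;accepted {} samples&#34;, samples.len());&#xA;}&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;A single flat envelope wastes proposals wherever the density sits far below it, which is exactly the inefficiency a piecewise envelope is meant to cure.&lt;/p&gt;</description>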
    </item>
    <item>
      <title>💔 Decorrelation &#43; Deep learning = Generalization</title>
      <link>https://axect.github.io/posts/005_decov/</link>
      <pubDate>Sat, 29 Oct 2022 17:39:54 +0900</pubDate>
      <guid>https://axect.github.io/posts/005_decov/</guid>
      <description>&lt;figure&gt;&#xA;    &lt;img src=&#34;https://axect.github.io/posts/images/005_01_paper.png&#34;&#xA;         alt=&#34;arXiv: 1511.06068&#34;/&gt; &lt;figcaption style=&#34;text-align:center&#34;&gt;&#xA;            &lt;p&gt;&lt;a href=&#34;https://arxiv.org/abs/1511.06068&#34;&gt;arXiv: 1511.06068&lt;/a&gt;&lt;/p&gt;&#xA;        &lt;/figcaption&gt;&#xA;&lt;/figure&gt;&#xA;&lt;p&gt;The most pervasive challenge in deep learning is &lt;span style=&#34;background-color: rgba(255, 255, 0, 0.534);&#34;&gt;&#xA;    &lt;b&gt;Overfitting&lt;/b&gt;&#xA;&lt;/span&gt;. It typically arises when the dataset is small and training runs long: the model excels on the training set but fails to generalize to validation data or real-world scenarios. Many strategies address this. Statistics has long relied on regularization methods like Ridge and LASSO, while deep learning has adopted weight regularization (weight decay) and network-specific techniques such as Dropout.&lt;/p&gt;
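&#xA;&lt;p&gt;For reference (standard definitions, not specific to the paper): for a linear model, Ridge minimizes $\sum_i (y_i - \mathbf{x}_i^\top \boldsymbol\beta)^2 + \lambda \lVert \boldsymbol\beta \rVert_2^2$, while LASSO swaps the penalty for $\lambda \lVert \boldsymbol\beta \rVert_1$; weight decay in deep learning applies the same $\ell_2$ idea to network weights.&lt;/p&gt;</description>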
    </item>
  </channel>
</rss>
