
I’ve had the Fydetab Duo for over a month, and I’m still trying to find ways to make the most of the device. At the very least, I’d like to write some blog posts on it during my downtime. Hence the following exploration.

This title is a bit too nested… but the fact is, I suddenly felt capable! I finally have a chance to become a full-stack developer! To show off my ChatGPT-enhanced capabilities, I decided to build a tool that uses ChatGPT to read papers (I’ve also considered using it for meta-analysis).

After upgrading my blog, I decided to do some cosmetic work on my GitHub profile…

When testing, I often need to create a new conda environment and then register a Jupyter kernel inside it so notebooks can use that environment. Re-registering kernels each time can be cumbersome, so here is a record of the process…
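For reference, here is a minimal sketch of the registration step, assuming ipykernel is already installed in the new environment; the environment name and display name below are only placeholders.

```python
# Register the current environment's Python as a Jupyter kernel.
# Equivalent CLI (run inside the activated conda environment):
#   python -m ipykernel install --user --name my-env --display-name "Python (my-env)"
from ipykernel.kernelspec import install

spec_path = install(
    user=True,                       # install into the per-user kernel directory
    kernel_name="my-env",            # placeholder: use the conda environment's name
    display_name="Python (my-env)",  # name shown in the Jupyter launcher
)
print(f"Kernel spec written to {spec_path}")
```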

I rarely use R for data processing or cleaning these days, because my daily work involves a lot of string extraction and manipulation, which is awkward in R. On top of that, tracing errors in R demands a lot of proficiency and experience, which makes writing and maintaining R code almost unbearable for me (I tried at a previous company…). Recently, however, a colleague asked me to handle such a task in R because R is all he knows. While copying the code over, I discovered yet another reason why I don’t need R for this kind of work…

Since the start of the pandemic, I have needed to work remotely from time to time. Today, I finally put together a fairly convenient, nearly VPN-like setup for accessing internal networks.

In daily work, there are always times when you need to put together a presentation to show project progress or results. But making a PPT is quite time-consuming (my OCD tendencies keep making me tweak fonts and positioning), and I’ve never used any of the fancy PPT elements and features anyway, so I turned to Marp to save some effort.

Recently, while helping a colleague create plots, I ran into some new issues, and solutions, involving ggpubr and the survival-analysis packages. Here is a record of them.

A while ago, I built an interesting interactive data-processing case with Dash. Here is a record of it.
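The original case isn’t reproduced here, but a minimal Dash sketch of the same idea might look like the following: a dropdown that filters a toy DataFrame and re-renders a table. All names and data are illustrative, not from the actual project.

```python
import pandas as pd
from dash import Dash, dcc, html, dash_table, Input, Output

# Toy data standing in for whatever is being processed interactively.
df = pd.DataFrame({"group": ["A", "A", "B"], "value": [1, 2, 3]})

app = Dash(__name__)
app.layout = html.Div([
    dcc.Dropdown(
        id="group-select",
        options=[{"label": g, "value": g} for g in sorted(df["group"].unique())],
        value="A",
    ),
    dash_table.DataTable(
        id="table",
        columns=[{"name": c, "id": c} for c in df.columns],
    ),
])

@app.callback(Output("table", "data"), Input("group-select", "value"))
def update_table(group):
    # Return only the rows belonging to the selected group.
    return df[df["group"] == group].to_dict("records")

if __name__ == "__main__":
    app.run(debug=True)
```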

I previously used Scrapy to crawl information from a drug website, but I didn’t take any notes at the time. Recently I helped a classmate crawl some data, so this time I’m writing it down.
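As a rough sketch (not the actual crawler; the start URL and CSS selectors below are placeholders for whatever site is being scraped), a minimal Scrapy spider looks like this:

```python
import scrapy

class DrugInfoSpider(scrapy.Spider):
    name = "drug_info"                          # hypothetical spider name
    start_urls = ["https://example.com/drugs"]  # placeholder URL

    def parse(self, response):
        # Yield one item per listing row; selectors depend on the target page's HTML.
        for row in response.css("div.listing"):
            yield {
                "name": row.css("a.title::text").get(),
                "link": response.urljoin(row.css("a.title::attr(href)").get()),
            }
        # Follow pagination if a "next" link is present.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Saved as a single file, it can be run with `scrapy runspider spider.py -o items.json` to dump the yielded items.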