#!/bin/bash -p

# /\
# _ _ the_link aggreGator _ **
# | (_)_ __ | | ___ _| | __ _| |_ ___ _ __ \_}}_/
# | | | '_ \| |/ / | | | |/ _` | __/ _ \| '__| {{
# | | | | | | <| |_| | | (_| | || (_) | | \_}}_/
# |_|_|_| |_|_|\_\\__,_|_|\__,_|\__\___/|_| {{ ,
# >>>^

# $PROGNAME is a command line link aggregator designed for small,
# trusting shell communities. This program allows users to share
# annotated links with others, browse links shared by others, and
# engage in brief discussions about the links.
#
# Functions include:
#
# POST NEW LINK, with annotations including keywords, title and
# short description. Links are not limited to http, but can include
# gopher and ssh as well.
#
# BROWSE LINKS that others have posted, including looking at details
# of individual links or even following links via lynx.
#
# VIEW LINK IN lynx. (bad URLs too, so user beware)
#
# COMMENT ON LINKS and view comments that have accrued on a link.
#
# SEARCH all links (including old, hidden links) by words saved as
# keywords. To do: probably should add ability to search by URLs.
#
# PRINT-TO-GOPHER: create a report of recent links in plaintext format
# suitable for serving via gopher.
#
# MANAGE DISK SPACE OR DISPLAY SPACE with optional program modes:
# * auto-clean: delete old links after a given period of inactivity.
# * auto-hide: only display recently touched links in browse mode.
# * fire-hose: display all links in order from most recently touched.
# (NOTE: auto-clean doesn't yet work. See below.)
#
# FEATURES THAT HAVE NOT BEEN DEVELOPED YET:
#
# * SAVE LINK TO PERSONAL FILE
# * PRINT HELP INFO FOR THIS PROGRAM
# * FLAG LINKS AS BAD/BROKEN SO THEY CAN BE FIXED OR REMOVED
# * ANY USER CAN ADD ADDITIONAL KEYWORDS TO A LINK

# THIS IS THE "BREAK ME" VERSION. PLEASE SHOOT THIS PROGRAM WITH
# A FIREHOSE OF BAD INPUT AND DOCUMENT ANY PROBLEMS YOU FIND.

# QUESTIONS? COMMENTS? SMART REMARKS?
# --> cmccabe@sdf.org

#
# LICENSE: THIS SOFTWARE IS LICENSED FOR USE IN DEFENDING THE MASSES
# FROM THE FORCES OF TOTALITARIANISM AND CORPORATOCRACY. IT IS NOT
# DESIGNED TO SUPPORT ARMED CONFLICT, BUT FOR ADDING TO A SUITE OF
# FUNDAMENTAL TOOLS CITIZENS NEED TO AVOID BEING DUMBED DOWN
# INTO ACQUIESCENCE -- TOOLS FOR COMMUNICATION, EDUCATION, AND
# AWARENESS ABOUT CURRENT EVENTS, AND FOR CONNECTING AND ORGANIZING
# WITH FELLOW HUMANS.
#

# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
#
# LIMITATIONS AND ISSUES:

# * CANNOT RELIABLY PREVENT DUPLICATE LINK POSTINGS BECAUSE MANY
#   URLS MAY HAVE MEANINGFUL QUERY STRINGS THAT DIFFER BETWEEN
#   LINKS TO THE SAME RESOURCE.
#   ...SO DUPLICATE CHECKING IS NOT PERFORMED.
#
# * LINK VALIDATION IS HARD, ESPECIALLY WHEN ALLOWING FOR MULTIPLE
#   PROTOCOLS (E.G. HTTP, GOPHER AND SSH), SO IT IS POSSIBLE THAT
#   USERS MAY SOMETIMES ENTER BAD LINKS.
#
# * THERE IS CURRENTLY NO ABILITY FOR USERS TO EDIT ENTRIES ONCE
#   SUBMITTED. A REQUEST TO THE ADMIN WILL BE NECESSARY IN THE CASE
#   THAT EDITS ARE REQUIRED. THIS WILL BE RE-BRANDED AS A FEATURE,
#   NOT A BUG: "PROMOTES INTERACTION WITH FRIENDLY SYSTEM ADMIN!"
#
# * SETUID WRAPPER DOES NOT (YET) PASS COMMAND LINE ARGUMENTS TO
#   THE BASH SCRIPT, SO PRINT-TO-GOPHER MODE MUST BE TRIGGERED
#   BY CALLING THE SCRIPT DIRECTLY. WHICH IS OK, BECAUSE SETUID
#   IS ONLY NEEDED WHEN ANOTHER USER WANTS TO WRITE TO FILES.

#####################################################################

# TO DO'S:

# IF LINK DOESN'T START WITH GOPHER:// HTTPS:// OR HTTP://, HIDE LYNX OPTION

# DISCARDING ADD-LINK EXITS ENTIRE PROGRAM

# NEED TO PROTECT AGAINST RACE CONDITIONS IN WHICH A FILE OR DIRECTORY
# HAS CHANGED WHILE THE USER IS EDITING A REPLY. THIS IS A DISTINCT
# POSSIBILITY IF THE PROGRAM IS RSYNC FEDERATED WITH OTHER SYSTEMS AND
# IF FILES MIGHT BE DELETED OR CHANGED BY AN ADMIN.
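
## ONE CHEAP GUARD FOR THE RACE DESCRIBED ABOVE (A SKETCH ONLY -- THE
## HELPER NAME IS A SUGGESTION, AND NOTHING CALLS IT YET): RE-CHECK
## THAT THE TARGET DIRECTORY STILL EXISTS RIGHT BEFORE WRITING A REPLY.
REPLY_TARGET_EXISTS () {
    ## SUCCEED ONLY IF THE POST DIRECTORY IS STILL PRESENT
    if [ -d "$1" ]; then
        return 0
    fi
    printf "ERROR: the target post changed while you were replying and no longer exists.\n"
    return 1
}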

# SETUID IS A RISKY WAY TO RUN THIS SCRIPT. IS THERE A BETTER APPROACH
# WHILE STILL USING BASH, OR DOES IT NEED TO BE PORTED TO A COMPILED
# LANGUAGE THAT CAN SAFELY BE RUN SETUID?

# CLEAN UP - CREATE MODES THAT CAN BE CONFIG'ED ON/OFF (HIDE/HOSE DONE)
# * AUTO-CLEAN: BASED ON CONFIG'ED DATES, REMOVE OLD FILES (DESTRUCTIVE)
#   -- AUTO-CLEAN MIGHT BE BETTER AS AUTO-ARCHIVE, SO THAT OLD LINKS
#      DON'T GET LOST
# * AUTO-HIDE: DO NOT DISPLAY FILES UNTOUCHED SINCE CONFIG'ED DATE
# * FIRE-HOSE: DISPLAY EVERYTHING, IN ORDER OF MOST RECENT TOUCH FIRST
# -- MAYBE JUST ADD A VERSION OF CLEAN TO AN ADMIN FUNCTION?

# PAGINATE OUTPUT - IN BROWSE MODE, AND POSSIBLY IN INDIV LINK VIEW MODE

# BETTER COLORIZE SOME TEXT FOR STATUS INDICATION:
# * https://stackoverflow.com/questions/5947742/how-to-change-the-output-color-of-echo-in-linux

# HARDEN THE PROGRAM AS MUCH AS REASONABLY POSSIBLE, INCLUDING
# PREVENTION OF USERS DEFACING SUBMISSIONS FROM OTHER USERS.
# * using nano -R rather than $EDITOR (done -- nice tip, solderpunk!)
# * using lynx -restrictions=all (done -- thanks asdf@tilde.town!)
# * using id rather than $USER (done)
# * setuid wrapper? stackoverflow unanimously says 'no no no', but is
#   it ok for a small, trusting pubnix? see solderpunk's raise/lower
#   version. can that be done in bash?
#

# ALLOW RSYNC FEDERATION WITH OTHER PUBNIXES
# -- OR WHAT IS THE BEST FEDERATION APPROACH?

# ALLOW USERS TO FLAG LINKS AS BROKEN SO THAT THEY CAN BE REMOVED.

# DECIDE HOW TO VALIDATE DIFFERENT PROTOCOLS
# * HTTP response can be used to validate some but not all HTTP
#   links. But see circumlunar.space as an exception.
# * how to validate all links without downloading the linked file?
# * should allow https, http, gopher, ssh. And more?
# * once validated, lynx is used to view
# * !! need to if-then block lynx from trying any url that doesn't
#   start with http or gopher (e.g. ssh:// )
#
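
## THE SCHEME GATE DESCRIBED ABOVE COULD BE A SMALL HELPER (A SKETCH
## ONLY -- THE FUNCTION NAME IS A SUGGESTION, AND NOTHING CALLS IT
## YET): SUCCEED ONLY FOR SCHEMES lynx SHOULD BE ALLOWED TO OPEN.
URL_SCHEME_IS_BROWSABLE () {
    case "$1" in
        http://* | https://* | gopher://* ) return 0 ;;
        * ) return 1 ;;
    esac
}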
# FINISH KEYWORD VALIDATION CHECKS
# * indiv keywords length restrictions
# * total char count of all keywords combined (done)
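
## ONE WAY TO FINISH THE PER-KEYWORD CHECKS (A SKETCH ONLY -- THE
## HELPER NAME IS A SUGGESTION, AND NOTHING CALLS IT YET): REJECT A
## KEYWORD THAT IS EMPTY, STARTS OR ENDS WITH A DASH, DOUBLES A DASH,
## OR CONTAINS ANYTHING BESIDES LETTERS, DIGITS AND DASHES.
KEYWORD_IS_WELL_FORMED () {
    case "$1" in
        "" | -* | *- | *--* | *[!a-zA-Z0-9-]* ) return 1 ;;
        * ) return 0 ;;
    esac
}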

# CHECK FOR DEPENDENCIES AND FAIL IF NOT MET
# E.G. lynx and sed
#
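
## THE DEPENDENCY CHECK ABOVE COULD LOOK LIKE THIS (A SKETCH ONLY --
## THE FUNCTION NAME IS A SUGGESTION, AND NOTHING CALLS IT YET).
## command -v IS POSIX AND AVOIDS which(1) PORTABILITY PROBLEMS.
REQUIRE_DEPENDENCIES () {
    for DEP in "$@"; do
        if ! command -v "$DEP" > /dev/null 2>&1; then
            printf "ERROR: required program '%s' was not found in PATH.\n" "$DEP"
            return 1
        fi
    done
    return 0
}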

# CLEAN UP SLOPPY VARIATION IN USE OF ls VS find
# * ls MIGHT BE FINE IN ALL CASES; BUT DOES ONE OR THE OTHER HAVE
#   A HIGHER MEMORY OR CPU DRAW?
# ALLOW USERS TO ADD KEYWORDS TO OTHER USERS' LINKS. THIS WOULD BE
# A LOT EASIER IN A sqlite VERSION. CURRENTLY, ALL WRITE ACTIONS CAN
# BE TRACED TO THE USER WHO MADE THEM. IF USERS WERE ALLOWED TO
# APPEND KEYWORDS TO THE date-username/keywords FILE, THEN YOU
# WILL NEVER KNOW WHO ADDED #PEEPEE #POOPOO TO EVERY SINGLE LINK
# IN THE FILES...
#

#####################################################################
## CONFIGURATION VARIABLES

CONFIG_FILE="/home/cmccabe/code/linkulator/config"

[ ! -f "$CONFIG_FILE" ] && {
    printf "Unable to find config file %s\n" "${CONFIG_FILE}"
    exit 1
}

. "${CONFIG_FILE}"

#####################################################################
## SUBROUTINES

START_UP_CHECKS () { ## START UP AND SETUP

    cd "/" || {
        printf "unable to chdir /\n"
        exit 1
    }
    ## THIS PREVENTS A 'find' HICCUP WHEN THE SCRIPT IS CALLED
    ## FROM WITHIN A DIRECTORY TO WHICH THE SETUID USER DOESN'T
    ## HAVE ACCESS

    if [ ! -d "$FILEPATH" ]; then
        printf "%s does not exist.\n" "$FILEPATH"
        printf "Create it? [y/n] anything else aborts: "
        read -r RESP
        case "$RESP" in
            y | Y ) mkdir -p "$FILEPATH"
                    if [ ! -d "$FILEPATH" ]; then
                        printf "\nERROR: Could not create %s.\n" "$FILEPATH"
                        printf "Check that you have privileges to that location.\n"
                        printf "Goodbye.\n"
                    else
                        printf "\nOk. Created. Goodbye.\n"
                    fi
                    ;;
            * ) printf "Not creating %s. Goodbye.\n" "$FILEPATH"
                ;;
        esac
        exit 0
    fi

    USER="$(id -run)" # BECAUSE A USER COULD OTHERWISE SET $USER TO
                      # ANY ARBITRARY VALUE (like rm -rf /yo/mama).
                      # AND, -run GETS REAL USERNAME, NOT EFFECTIVE.
    GOPHER="n"        # WILL CHANGE TO "y" LATER IF $1="g"

}

START_UP_CHECKS

PRINT_GOPHER_VERSION () {
    printf 'RECENT LINKS POSTED TO LINKULATOR ON THE ZAIBATSU\n Printed on %s\n ~ log in for discussion on these links ~\n' \
        "$(date +'%Y/%m/%d')"
    BROWSE_LINKS
}

ADD_NEW_LINK () {
    ## GET AND VALIDATE USER INPUT, AND SAVE TO FILE

    TITLE=""
    DESCRIPTION=""
    KEYWORDS=""

    LINK="$1" ## FILL W/COMMAND LINE ARGUMENT IF SUPPLIED

    while true; do
        while true; do
            ## (KatolaZ) this needs to be thought through a bit more....
            echo -e "\nEnter/Edit your link for submission (or cancel with a blank line): "
            read -i "$LINK" -e LINK # -i = default text, -e = allow line editing

            # GREATER THAN $URL_MIN_LEN
            # LESS THAN $URL_MAX_LEN
            # REJECT ON MALFORMED URL (NEED TO DEFINE AND RECOGNIZE THIS)

            if [ "${#LINK}" -eq "0" ]; then
                printf "\nLink submission canceled.\n"
                BROWSE_LINKS
                return
            elif [ "${#LINK}" -le "$URL_MAX_LEN" ] && [ "${#LINK}" -ge "$URL_MIN_LEN" ]; then
                ## WHY STILL USE -ge? BECAUSE $URL_MIN_LEN MIGHT NOT BE ZERO.
                break;
            else
                printf "\nError: Link URL must be no more than %s and at least %s characters." "$URL_MAX_LEN" "$URL_MIN_LEN"
                LINK='z'
            fi

            ## TODO: ADD VALIDATION OF LINK (HTTP SOMEWHAT POSSIBLE,
            ## BUT WHAT ABOUT GOPHER OR SSH?)

        done

        while true; do
            # GREATER THAN $TITLE_MIN_LEN
            # LESS THAN $TITLE_MAX_LEN
            printf "\nEnter/Edit the short title of this link: "
            ## (KatolaZ) maybe the read below should be rethought?
            read -i "$TITLE" -e TITLE
            if [ "${#TITLE}" -le "$TITLE_MAX_LEN" ] && [ "${#TITLE}" -ge "$TITLE_MIN_LEN" ]; then
                break;
            else
                printf "\nERROR: Link title must be no more than %s and at least %s characters." "${TITLE_MAX_LEN}" "${TITLE_MIN_LEN}"
            fi
        done

        while true; do
            # LESS THAN $DESC_MAX_LEN; ZERO LEN IS OK
            printf "\nEnter a short, one-liner description (< %s chars) of the link: " "${DESC_MAX_LEN}"
            read -i "$DESCRIPTION" -e DESCRIPTION

            if [ "${#DESCRIPTION}" -le "${DESC_MAX_LEN}" ]; then
                break;
            else
                printf "\nERROR: One-liner description must be no more than %s characters." "${DESC_MAX_LEN}"
            fi
        done

        while true; do
            printf "\nEnter zero or more keywords (alphanumeric and dashes only) separated by spaces: "
            ## (KatolaZ) -- same as above -- KEYWORDS will be empty here, right?
            read -i "$KEYWORDS" -e KEYWORDS

            ## TODO: KEYWORDS SHOULDN'T START OR END WITH DASHES OR HAVE MORE THAN
            ## ONE DASH IN A ROW.
            ## CHECK FOR NON-ALPHANUM + DASH:
            KEYWORDS2="$(echo "$KEYWORDS" | tr -cd 'a-zA-Z0-9 \n-')"
            # (KatolaZ) do you really need an array here? why not a string instead?
            KW_ARRAY=($KEYWORDS)
            KW_LONGEST=0
            for i in "${KW_ARRAY[@]}"; do
                if [ "${#i}" -ge "${KW_LONGEST}" ]; then
                    KW_LONGEST="${#i}"
                fi
            done
            if [ "$KEYWORDS2" != "$KEYWORDS" ] || [ "${#KEYWORDS}" -gt "${KEYWORDS_MAX_LEN}" ]; then
                printf "\nERROR: Keywords may only contain letters, digits and dashes, and may total no more than %s characters." "${KEYWORDS_MAX_LEN}"
            elif [ "${KW_LONGEST}" -gt "${KEYWORD_MAX_LEN}" ]; then
                printf "\nERROR: Individual keywords must be no more than %s characters." "${KEYWORD_MAX_LEN}"
            else
                break
            fi
        done

        echo "${HORZ_RULE}"
        printf "Your link submission:\n"
        printf "LINK: %s\n" "$LINK"
        printf "TITLE: %s\n" "$TITLE"
        printf "DESCRIPTION: %s\n" "$DESCRIPTION"
        printf "KEYWORDS: %s\n" "$KEYWORDS"
        # "$LINK" "$TITLE" "$DESCRIPTION" "$KEYWORDS"

        echo "${HORZ_RULE}"
        SAVED='no'
        # (KatolaZ) rethink the read below?
        read -n1 -p "[s]ave, [e]dit, [d]iscard? (anything else aborts): " ADD_NEW_DECISION
        case "${ADD_NEW_DECISION}" in
            s ) SAVED="yes"
                break
                ;;
            e ) printf '\n' ## LOOP BACK TO RE-EDIT THE SUBMISSION
                ;;
            * ) break ;; ## 'd' IS SAME AS ABORT
        esac
    done

    if [ "$SAVED" = "yes" ]; then
        # make dir name
        POSTID="$(date +%s)-${USER}"
        DIRNAME="${FILEPATH}/${POSTID}"
        mkdir -p "${DIRNAME}"

        # make link file name, 4 lines: link, title, desc, keywords
        ## (KatolaZ) I am not sure about the logic below...
        printf "%s\n" "$LINK" > "${DIRNAME}/link"
        # REPLY TO LINKS
        printf "%s\n" "$TITLE" > "${DIRNAME}/title"
        printf "%s\n" "$DESCRIPTION" > "${DIRNAME}/description"
        printf "%s\n" "$KEYWORDS" > "${DIRNAME}/keywords"

        printf "\nYour link post was added.\n"
    fi

    while true; do
        # (KatolaZ) Maybe rethink the logic below?
        read -n1 -p "[b]rowse links, or [q]uit? " TEMP_FILE_DECISION
        echo
        case "${TEMP_FILE_DECISION}" in
            b ) BROWSE_LINKS
                ;;
            q ) break
                ;;
            * ) printf "\nError. Please answer 'b' or 'q'."
                ;;
        esac
    done

}

BROWSE_LINKS () {

    if [ -n "${1+x}" ]; then ## KEYWORD SEARCH
        KW="$1"
        FILES=$(find "$FILEPATH" -name "keywords" | xargs grep -i "$KW" | awk -F "/keywords:" '{print $1}')
        FILES=($FILES)
        echo "Searching by keyword \"$KW\"... ${#FILES[@]} total results."
    else ## NORMAL BROWSE MODE
        FILES=$(ls -dt "$FILEPATH"*)
        FILES=($FILES)
        echo "All links. Sorted by most recently touched."
    fi

    UB=${#FILES[@]}
    UB=$((UB-1)) ## B/C IT'S ONE LESS THAN NUM ELEMENTS

    if [[ $FILE_MANAGEMENT_MODE = "auto-hide" ]] && [[ $UB -gt $BROWSE_DISPLAY_MAX ]]; then
        ## TRIM RESULTS DOWN TO THE auto-hide MAX:
        FILES=("${FILES[@]::$BROWSE_DISPLAY_MAX}")
    fi

    if [[ $GOPHER = "y" ]]; then
        echo

        for i in "${FILES[@]}"; do
            i=${i//\/keywords/} ## TRIM /keywords OFF FILEPATH
            POSTER_NAME=${i##*-} ## USERNAME OF PERSON WHO POSTED LINK
            # vv HOLY COW THIS DATE GETTING/CONVERTING IS KLUDGEY. I FEEL ICKY.
            BN=$(basename "$i")
            BN=$(echo "$BN" | cut -d- -f1)
            CREATE_DATE=$(date -d @"${BN}" +"%Y-%m-%d")
            NUM_REPLIES=$(find "$i"/. -name '[0-9]*' | wc -l)
            TITLE_TEXT=$(cut -c1-"$MAX_TITLE_LEN" "$i"/title)

            echo "TITLE: $TITLE_TEXT"
            echo "URL: $(cat "$i"/link)"
            echo "DESCRIPTION: $(cat "$i"/description)"
            echo "POSTED BY: $POSTER_NAME ON $CREATE_DATE"
            echo "-----"
            # printf "%4.4s" $LINKCOUNT
        done

        exit
    else
        ## TODO: IMPROVE OUTPUT FORMAT OF ID, USER, DATE, TITLE, NUM-REPLIES
        RED='\033[0;31m'
        NC='\033[0m' # NO COLOR

        echo "$HORZ_RULE"

        printf "%4.4s" "ID#"
        printf "%12.12s" " Date "
        printf "%18.18s" "Name [#re:] "
        printf " Link Title"
        echo

        LINKCOUNT=0
        echo "$HORZ_RULE"
        for i in "${FILES[@]}"; do
            i=${i//\/keywords/} ## TRIM /keywords OFF FILEPATH
            POSTER_NAME=${i##*-} ## USERNAME OF PERSON WHO POSTED LINK
            # vv HOLY COW THIS DATE GETTING/CONVERTING IS KLUDGEY. I FEEL ICKY.
            BN=$(basename "$i")
            BN=$(echo "$BN" | cut -d- -f1)
            CREATE_DATE=$(date -d @"${BN}" +"%Y-%m-%d")
            NUM_REPLIES=$(find "$i"/. -name '[0-9]*' | wc -l)
            TITLE_TEXT=$(cut -c1-"$MAX_TITLE_LEN" "$i"/title)
            printf "%4.4s" "$LINKCOUNT"
            printf " %s " "$CREATE_DATE"
            printf "%b" "$RED"
            printf "%12.12s" "$POSTER_NAME"
            printf "%b" "$NC"
            printf " ["
            printf "%2.2s" "$NUM_REPLIES"
            printf "] %s\n" "$TITLE_TEXT"
            LINKCOUNT=$((LINKCOUNT+1))
        done
    fi


    while true; do
        echo "$HORZ_RULE"
        if [[ ${#FILES[@]} -gt 0 ]]; then
            read -p "[a]dd link, [b]rowse all, [k]eyword search, [q]uit, or link [#ID] number: " REPLY
        else
            read -p "[a]dd link, [b]rowse all, [k]eyword search, [q]uit: " REPLY
        fi
        echo

        UB=${#FILES[@]}
        UB=$((UB-1)) ## B/C IT'S ONE LESS THAN NUM ELEMENTS
        if [[ ${REPLY} =~ ^[0-9]+$ && ${REPLY} -ge 0 && ${REPLY} -le $UB ]]; then
            DIR_PATH=${FILES[${REPLY}]}
            VIEW_LINK_DETAIL "$DIR_PATH"
            break
        elif [ "${REPLY}" = "a" ]; then
            ADD_NEW_LINK; break ;
        elif [ "${REPLY}" = "b" ]; then
            BROWSE_LINKS
        elif [ "${REPLY}" = "r" ]; then
            BROWSE_LINKS; break ;
        elif [ "${REPLY}" = "k" ]; then
            SEARCH; break ;
        elif [ "${REPLY}" = "p" ]; then
            echo "prev link function not working"
            break ;
        elif [ "${REPLY}" = "n" ]; then
            echo "next link function not working"
            break ;
        elif [ "${REPLY}" = "q" ]; then
            break ;
        else
            if [[ ${#FILES[@]} -gt 0 ]]; then
                echo -e "\nERROR: Valid options are 'a', 'b', 'k', 'q' or a link number.";
            else
                echo -e "\nERROR: Valid options are 'a', 'b', 'k' or 'q'.";
            fi
        fi
    done
}

VIEW_LINK_DETAIL () {

    LINK=$(cat "$1"/link)
    SHOW_REPLIES=$2
    POSTER_NAME=${1##*-} ## USERNAME OF PERSON WHO POSTED LINK

    BN=$(basename "$1")
    BN=$(echo "$BN" | cut -d- -f1)
    CREATE_DATE=$(date -d @"${BN}" +"%Y-%m-%d")

    echo "Title: $(cat "$1"/title)"
    echo "Link: $(cat "$1"/link)"
    echo "Description: $(cat "$1"/description)"
    echo "Keywords: $(cat "$1"/keywords)"
    echo "Posted by: ${POSTER_NAME} on ${CREATE_DATE}"

    shopt -s extglob ## MOVE UP TO FILE HEADER

    ## TODO: LIST REPLIES AND ALLOW VIEWING OF EACH
    if [[ $SHOW_REPLIES = "y" ]]; then
        unset REPLIES
        REPLIES=$(find "$1" -type f -name '[0-9]*')
        IFS=$'\n' REPLIES=($(sort <<<"${REPLIES[*]}")); unset IFS
        # REPLIES=$(ls -rft $1/[0-9]*) ## <<- BUG HERE, WHEN LS RETURNS ZERO RESULTS.
        REPLIES=(${REPLIES[@]})
        TOT_REPLIES=${#REPLIES[@]}
        if [[ "$TOT_REPLIES" -eq "0" ]]; then
            echo -e "\nThis link does not yet have replies. Be the first - type 'r'!\n";
        fi
        for i in "${REPLIES[@]}"; do
            FN=$(basename "$i")
            REPLY_USER=$(echo "$FN" | sed 's/^\([0-9]\)*//')
            REPLY_USER=$(echo "$REPLY_USER" | sed 's/^\(-\)*//')
            REPLY_DATE=$(stat -c %y "$i" | cut -d" " -f1)
            echo "$HORZ_RULE"
            echo "$REPLY_DATE $REPLY_USER:"
            echo $(cat "$i")
        done
    fi

    while true; do
        echo "$HORZ_RULE"
        read -n1 -p "view in [l]ynx, [s]how replies, [r]eply, [b]rowse links, or [q]uit? " REPLY
        echo; echo "$HORZ_RULE"
        case $REPLY in
            ## l ) $DEFAULT_BROWSER "$LINK" ;;
            l ) if [[ $LINK == http://* ]] || [[ $LINK == https://* ]] || [[ $LINK == gopher://* ]]; then
                    $DEFAULT_BROWSER "$LINK"
                else
                    echo "SORRY! $PROGNAME can only view URLs prefixed with http:// https:// or gopher://"
                fi
                ;;
            s ) VIEW_LINK_DETAIL "$1" "y" ; break;;
            r ) COMMENT_ON_LINK "$1" ; break ;;
            b ) BROWSE_LINKS; break ;;
            q ) echo -e "\n"; exit ;;
            * ) echo -e "\nError. Please answer 'l', 's', 'r', 'b', or 'q'." ;;
        esac
    done

}

COMMENT_ON_LINK () {

    FILE_DIR=$1
    # RANDSTR=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 8 | head -n 1)
    # TEMP_FN=LINK8_$RANDSTR.tmp
    #
    # TEMP_FN=/tmp/$TEMP_FN
    # FILE_DIR=$1
    # $DEFAULT_EDITOR $TEMP_FN
    # ## USER MAY CANCEL EDIT WITHOUT SAVING, THEREBY NOT CREATING THE FILE
    # if [[ -f "$TEMP_FN" ]]; then
    #     cat $TEMP_FN

    while true; do
        # LESS THAN $COMMENT_MAX_LEN
        echo -e "\n\nType your comment about this link (\"Enter\" submits, blank line aborts):"
        read -i "$COMMENT" -e COMMENT

        if [[ ${#COMMENT} -le $COMMENT_MAX_LEN ]] && [[ ${#COMMENT} -gt 0 ]]; then
            read -n1 -p "[s]ave, [e]dit, [d]iscard? (anything else aborts) " SAVE_COMMENT_DECISION
            echo
            case $SAVE_COMMENT_DECISION in
                s ) REPLY_ID=$(date +%s)-$USER
                    echo "$COMMENT" > "$FILE_DIR/$REPLY_ID"
                    unset COMMENT
                    echo "Reply saved."; echo;
                    break ;;
                e ) : ;; # EDIT
                * ) unset COMMENT; echo; break ;; ## 'd' IS SAME AS ABORT
            esac
        elif [[ ${#COMMENT} -le 0 ]]; then
            echo "Reply aborted."
            break;
        else
            echo -e "\nERROR: Comment must be no more than $COMMENT_MAX_LEN characters."
        fi
    done

    # while true; do
    #     read -n1 -p "[s]ave? [e]dit? [d]iscard? " TEMP_FILE_DECISION
    #     case $TEMP_FILE_DECISION in
    #         s ) cp $TEMP_FN $FILE_DIR/$REPLY_ID;
    #             rm $TEMP_FN
    #             echo -e "\nReply saved. Thank you."
    #             break ;;
    #         e ) ${EDITOR:-$DEFAULT_EDITOR} $TEMP_FN ;;
    #         d ) rm $TEMP_FN; break ;;
    #         * ) echo -e "\nError. Please answer 's', 'e' or 'd'.";;
    #     esac
    # done
    # else
    #     echo "Reply aborted."
    # fi

    ## - TAKE COMMENT DIRECTORY AS ARGUMENT
    ## - IF DIRECTORY NO LONGER EXISTS, "ERROR: SORRY, THE TARGET
    ##   COMMENT HAS CHANGED SINCE YOU BEGAN RESPONDING AND IT NO
    ##   LONGER EXISTS"
    ## - CREATE NEW FILE NAME JUST BEFORE SAVING, TO AVOID COLLISIONS

    VIEW_LINK_DETAIL "$1" "y"
}


#CHECK_IF_FILE_EXISTS () {
## INTENDED TO VALIDATE THAT A LINK POINTS TO AN EXISTING RESOURCE
## WITHOUT HAVING TO DOWNLOAD THAT RESOURCE.
## WORKS WITH WWW SERVERS THAT RESPOND WITH HTTP/1.1 200 OK, BUT
## NOT ALL DO (E.G. CIRCUMLUNAR.SPACE), SO NOT A COMPLETE SOLUTION.
## ALSO, DOES NOT WORK WITH GOPHER LINKS. MAYBE THIS FUNCTION IS NOT
## FEASIBLE...
#
#    FILE_EXISTS=false
#    if [[ `wget -S --spider $LINK 2>&1 | grep 'HTTP/1.1 200 OK'` ]];
#        then FILE_EXISTS=true;
#    fi
#}


CLEANUP () {
    ## REMOVE OLD LINKS, ACCORDING TO CONFIGURATION VARIABLE SETTINGS.
    echo "Clean up subroutine not yet written..."
}


SEARCH () {
    read -p "Enter a keyword to search (leave blank to abort): " REPLY
    ## TODO: VALIDATE KEYWORD INPUT (ONE WORD, ALPHANUM + DASHES?)
    if [[ ${#REPLY} -ge 1 ]]; then
        BROWSE_LINKS "$REPLY"
    else
        echo -e "\nSearch aborted."
        BROWSE_LINKS
    fi
}


HELP () {
    ## PRINT PROGRAM HELP AND ABOUT INFO
    echo "$HORZ_RULE"
    echo
    echo "$PROGNAME help and about:"
    echo "Author(s): $AUTHORS, Last Code Update: $LASTCOMMIT"
    echo
    echo "$PROGNAME is a super-lightweight, shell-only link aggregator"
    echo "for small, trusting shell communities like public access"
    echo "Unix/GNU/Linux systems or tilde-boxes."
    echo
    echo "You can add or browse gopher or www links, or comment on links."
    echo "Quit with [q] most times, or ctrl-c at any time."
    echo
    echo "Found a bug or have a feature request?"
    echo " --> Send email to $LEAD_DEV_EMAIL."
    echo
}


#####################################################################
## MAIN PROGRAM FLOW CONTROL:

## RESPOND ACCORDING TO PROGRAM MODE (COMMAND LINE ARGS):
if [[ $1 = "g" ]]; then
    GOPHER="y"
    PRINT_GOPHER_VERSION; exit
elif [[ $1 = "a" ]]; then
    ADD_NEW_LINK "$2"; exit
elif [[ $1 = "h" ]]; then
    HELP; exit
elif [[ $1 = "k" ]]; then
    echo -e "\nKeyword subsetting not yet implemented... :(\n"; exit
else
    if command -v figlet > /dev/null 2>&1; then
        figlet "$PROGNAME"
    else
        echo "$PROGNAME" ## FALL BACK IF figlet IS NOT INSTALLED
    fi
    BROWSE_LINKS
fi

echo -e "\n\nThank you for using $PROGNAME. Please come again.\n"
exit


# ==*^^^^^^^^>>>>>
# // \\
# This is Linky, the link aggregator gator.